Dataset columns and value-length ranges (string lengths in characters, list lengths in tokens):

    Input            string   251 – 41.6k
    Output           string   137 – 9.7k
    input_ids        list     157 – 2.05k
    attention_mask   list     157 – 2.05k
    labels           list     157 – 2.05k
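As a quick orientation, here is a minimal sketch of how a dataset with this schema could be loaded and inspected with the Hugging Face `datasets` library; the dataset identifier and split name are placeholders, since the listing does not name them, and only the column names above are taken from it.

```python
# Minimal sketch: load a dataset with the columns listed above and inspect one row.
# "your-org/review-summarization" and the split name are placeholders, not real identifiers.
from datasets import load_dataset

ds = load_dataset("your-org/review-summarization", split="train")  # hypothetical dataset id

row = ds[0]
print(row["Input"][:300])   # review prompt, ends with "### Summary:"
print(row["Output"])        # reference summary
print(len(row["input_ids"]), len(row["attention_mask"]), len(row["labels"]))  # 157 to ~2.05k tokens each
```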
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: thank you for a pleasurable and informative read i consider the writing and structure of the paper to be coherent and well written given an endtoend learning of neural motifs a great deal of time can be avoided reducing the several intermediary steps required to detect motifs from calcium imaging this paper may very well improve researchers efficiency in particular when working with calcium imaging the question remain to what extent these ideas may be useful in other imaging modalities ie fmri my main critique would be to be more explicit about why the vae you propose is superior to other models in the generative modelling domaindocsepthe paper proposes a vaestyle model for identifying motifs from calcium imaging videos as opposed to standard vae with gaussian latent variables it relies on bernouli variables and hence requires gumbelsoftmax trick for inference compared to methods based on matrix factorization the proposed method has the advantage of not requiring any preprocessing on the imaging videos my main comments are as follows how sensitive is the method to the choice of beta and other hyperparameters compared to scc which has fewer hyperparameters how robust is the method how does it perform on real data compared to methods based on spike time matrices do they generate similar motifs the application of the method seems quite limited to calcium imaging videos and it does not provide comparison with other deep generative models for videos methods such as johnson et al nips 2016 composing graphical models with neural networks for structured representations and fast inference can also be applied to calcium imaging datasets and can potentially infer the motifs i believe the problem of inferring the neural motifs is an interesting problem however i think this paper requires more work to it shows its advantages over other deep generative models for video data and also its performance on real data compared to scc or some other matrix factorization based approach the authors have addressed my comments about other deep generative models and hyperparameter sensitivity however i still think the paper is more suitable for other venues with readers from the neuroscience community hence i change my rating to 5 docseplast time i had two comments 1 the real data motifs did not look like what id expect motifs to look like now that the authors have thresholded the real data motifs they do look as id expect 2 im not a fan of vae and believe that simpler optimization algorithms might be profitable i acknowledge that scc requires additional steps i am not comparing to scc rather im saying given your generative model there are many strategies one could employ to estimate the motifs i realize that vae is all the rage and is probably fine in my own experiments simpler methods often work as well or better for these types of problems i therefore believe this would be an interesting avenue to explore in future work ### Summary:
this paper is about representation learning for calcium imaging and thus a bit different in scope that most iclr submissions but the paper is wellexecuted with good choices for the various parts of the model making it relevant for other similar domains
[input_ids / attention_mask / labels for this example: token-id encodings of the Input above (attention_mask is all 1s, labels mirror input_ids); full arrays omitted]
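In the dumped arrays, input_ids and labels are identical and attention_mask is all 1s, which is consistent with a causal-language-model setup in which labels are simply a copy of the input token ids. The following sketch shows how such columns could be produced under that assumption; the tokenizer checkpoint is a placeholder, as the listing does not say which tokenizer generated these ids.

```python
# Sketch under the assumption above: derive input_ids / attention_mask / labels from the "Input" string.
# The checkpoint is a placeholder; max_length=2048 roughly matches the 2.05k upper bound in the stats.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # placeholder tokenizer

def tokenize_row(row, max_length=2048):
    enc = tokenizer(row["Input"], truncation=True, max_length=max_length)
    enc["labels"] = list(enc["input_ids"])  # labels mirror input_ids, as in the dumped rows
    return enc

# Usage (hypothetical): tokenized = ds.map(tokenize_row, remove_columns=["Input", "Output"])
```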
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper takes a good step toward developing more structured representations by exploring the use of quaternions in recurrent neural networks the idea is motivated by the observation that in many cases there are local relationships among elements of a vector that should be explicitly represented this is also the idea behind capsules to have each unit output a vector of parameters to be operated upon rather than a single number here the authors show that by incorporating quaternions into the representations used by rnns or lstms one achieves better performance at speech recognition tasks using fewer parameters the quaternionic representation of the spectrogram chosen here seems a bit arbitrary why are these the attributes to be packaged together its not obvious shouldnt this be learned docsepquality sufficient though there are issues work done in automatic speech recognition on numerous variants of recurrent models such as interleaved tdnn and lstm peddinti 2017 is completely ignored addressed in the revision the description of derivatives needs to mention the linear relationship between input features and derivatives see trajectory hmms by zen and tokuda addressed in the revision timit is a very simple task addressed by adding wsj experiments derivations in the appendices could be connected better addressed in the revision clarity sufficient it would be good to see some discussion of 1 split activations and other possible options short comment added in the revision if any 2 expressions of derivatives and their connection to standard rnn derivatives short comment added in the revision 3 computational complexity addressed in the revision originality sufficient this paper describes the extension of quaternion feedforward neural networks to recurrent neural networks and a parameter initialisation method in the quaternial domain significance sufficient pros audience interested in quaternial neural networks would benefit from this publication experimental results even if limited suggest that quaternial representation may offer a significant reduction in the number of model parameters at no loss in performance cons the choice of derivatives to yield quaternions as there are other more interesting views to contemplate both in speech and other fields a simple task makes it hard to judge how the quaternion extension would scale other the format of references the use of a number in parentheses is unusual and distractive fixed in the revision please at least name all the terms used in the main paper body even if they are defined later in the appendix eg ht in equation 10 fixed in the revision do both whh and bh contain the same deltahht term in their update equation 11 fixed in the revision page 7 by mistake mentions 182 which cannot be found in the table 1 fixed in the revision page 12 is equals to remains in the revision docsepafter the discussion with authors i am happy to recommend acceptance 1 in consequently for each input vector of size n output vector of size m dimensions are split into four parts the first one equals to r the second is xi the third one equals to yj and the last one to zk to compose a quaternion q r1 xi yj zk are you splitting dimension m or mtimes n and if you split m times n i believe thats what you are doing in which order you are splitting row major right please explain 2 i did not understand why authors didnt go in the negative direction of the gradient in eq 
1011 3 in section 34 authors mentioned moreover an hypercomplex parameter cannot be simply initialized randomly and componentwise due to the interactions between components which i strongly agree but in eq 7 and 9 why the update rules and activation function are applied component wise 4 i really like the elegance in the parameter initialization couple of minor things here 1 its better to mention in eq 16 why ew is 0 because of symmetry 2 reference should be 61 instead of 51 5 another reasonable baseline will be using a complex network like httpsopenreviewnetforumidh1t2hmzab and use the first two terms in eq 19 for representation this will also possibly justify the usefulness of using higher order partials 6 the authors mentioned multiple times about the achieved stateoftheart results without giving any citation as a reader not well versed in the acoustic domain it will be nice to see some references to crossvalidate the claim made general comments 1 i understand the necessity of defining rnn lstm model in the space of quaternions but unit quaternions can be identified with other spaces where convolution is defined recently eg with s3 httpsarxivorgabs180906211 i can see that this paper is contemporary but at least can authors comment on the applicability of this general method in their case given that in nips18 the following paper talked about rnn model on noneuclidean spaces httpsarxivorgpdf180511204pdf one can extend these ideas to develop an rnn model in the space of quaternions authors should look into it rigorously as future directions but at least please comments on the applicability 2 the experimental results section is somewhat weak the overall claim of using fewer parameters and achieving comparable results is only validated on timit data more experimentation is necessary 3 in terms of technical novelty though quaternion algebra is wellknown i like the parameter initialization algorithm i can see the merit of this in ml vision community pros 1 nice well grounded methodological development on wellknown algebra simple but elegant so thats good 2 nicely written and all the maths check out thats good 3 experimental result on timit dataset shows usefulness in terms of using fewer parameters but still can achieve soa results cons 1 see my comments above i expect the authors to rebut address the aforementioned comments overall though simple but nice and necessary development of rnn lstm framework in the space of quaternions 2 lacks extensive experimental validation my reason for my rating is mainly because of 1 lack of experimental validation 2 being aware of the recent development of general rnn model on noneuclidean spaces i want some comments in this direction see detailed comment and reference above ### Summary:
the authors derive and experiment with quaternionbased recurrent neural networks and demonstrate their effectiveness on speech recognition tasks timit and wsj where the authors demonstrate that the proposed models can achieve the same accuracy with fewer parameters than conventional models the reviewers were unanimous in recommending that the paper be accepted
[input_ids / attention_mask / labels for this example: token-id encodings of the Input above (attention_mask is all 1s, labels mirror input_ids); full arrays omitted]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper describes a procedure for training neural networks via an explicit constraint on the energy budget as opposed to pruning the model size as commonly done with standard compression methods comparative results are shown on a few data sets where the proposed method outperforms multiple different approaches overall the concept is interesting and certainly could prove valuable in resourceconstrained environments still i retain some reservations as detailed below my first concern is that this paper exceeds the recommended 8 page limit for reasons that are seemingly quite unnecessary there are no large essential figurestables and nearly the first 6 pages is just introduction and background material likewise the paper consumes a considerable amount of space presenting technical results related to knapsack problems and various epsilonaccurate solutions but this theoretical content seems somewhat irrelevant and distracting since it is not directly related to the greedy approximation strategy actually used for practical deployment much of this material could have been moved to the supplementary so as to adhere to the 8 page soft limit per the iclr reviewer instructions papers deemed unnecessarily long relative to this length should be judged more critically another issue relates to the use of a mask for controlling the sparsity of network inputs although not acknowledged similar techniques are already used to prune the activations of deep networks for compression in particular various forms of variational dropout essentially use multiplicative weights to remove the influence of activations andor other network components similar to the mask m used is this work representative examples include neklyudov et al structured bayesian pruning via lognormal multiplicative noise nips 2017 and louizos et al bayesian compression for deep learning nips 2017 but there are many other related alternatives using some form of trainable gate or mask possibly stochastic to affect pruning the major ml and cv conferences over the past year have numerous related compression papers so i dont consider this aspect of the paper to be new in any significant way moreover for the empirical comparisons it would be better to compare against stateoftheart compression methods as opposed to just the stated mp and ssl methods from 2015 and 2016 respectively despite claims to the contrary on page 9 i would not consider these to be stateoftheart methods at this point another comment i have regarding the experiments is that hyperparameters and the use of knowledge distillation were potentially tuned for the proposed method and then simultaneously applied to the competing algorithms for the sake of headtohead comparison but to me if these enhancements are to be included at all tuning must be done carefully and independently for each algorithm was this actually done moreover it would have been nice to see results without the confounding influence of distillation to isolate sources of improvement but no ablation studies were presented finally regarding the content in section 5 the paper carefully presents an explicit bound on energy that ultimately leads to a constraint that is nphard just to project on to although approximate solutions exist that depend on some error tolerance however even this requires an algorithm that is dismissed as complicated instead a greedy alternative is derived in the appendix which presumably 
serves as the final endorsed approach but at this point it is no longer clear to me exactly what performance guarantees remain with respect to the energy bound theorem 3 presents a fairly inscrutable bound and it is not at all transparent how to interpret this in any practical sense note that after theorem 3 conditions are described whereby an optimal projection can be obtained but these seem highly nuanced and unlikely to apply in most cases additionally it would appear that crude bounds on the energy could also be introduced by simply penalizingconstraining the sparsity on each layer which leads to a much simpler projection step for example a simple affine function of the l0 norm would be much easier to optimize and could serve as a loose bound on the energy given that the latter should be a nondecreasing function of the l0 norm any idea how such a bound compares to those presented given all the approximations and greedy steps that must be included other comments as an implementation heuristic the proposed algorithm 1 gradually decays the parameter q which controls the sparsity of the mask m but this will certainly alter the energy budget and i wonder how important it is to employ a complex energy constraint if minimization requires this type of heuristic i did not see where the quantity lmw embedded in eq 17 was formally defined although i can guess what it is in general it is somewhat troublesome that on top of a complex nonconvex deep network energy function just the small subproblem required for projecting onto the energy constraint is nphard even if approximations are possible i wonder if this extra complexity is always worth it relative so simple sparsitybased compression methods which can be efficiently implemented with exactly closedform projections in table 1 the proposed method is highlighted as having the smallest accuracy drop on squeezenet but this is not true eap is lower likewise on alexnet netadapt has an equally optimal energydocsepthe paper is dedicated to energybased compression of deep neural networks while most works on compression are dedicated to decreasing the number of parameters or decreasing the number of operations to speedup or reducing of memory footprint these approaches do not provide any guarantees on energy consumption in this work the authors derived a loss for training nn with energy constraints and provided an optimization algorithm for it the authors showed that the proposed method achieves higher accuracy with lower energy consumption given the same energy budget the experimental results are quite interesting and include even highly optimized network mobilenetv2 several questions and concerns our energy modeling results are validated against the industrystrength dnn hardware simulator scalesim could the authors please elaborate on this sentence one of the main assumptions is the following if the value of the data is zero the hardware can skip accessing the data as far as i know this is a quite strong assumption that is not supported by many architectures how do the authors take into account overhead of using sparse data formats in such hardware in their estimations is it possible to simulate such behavior in scalesim moreover in many modern systems dram can only be read in chunks therefore it can decrease number of dram accesses in 4 small typos and other issues page 8 there exists an algorithm that can find an an epsilon page 8 but it is possible to fan approximate solution page 4 it is better to put the sentence where s convolutional stride after 
2 in formulation of the theorem 3 it is better to explicitly state that a contains rational numbers only since gcd is used overall the paper is written clearly and organized well contains interesting experimental and theoretical results docsepthe paper proposes a method for neural network training under a hard energy constraint ie the method guarantees the energy consumption to be upper bounded based on a systolic array hardware architecture the authors model the energy consumption of transferring the weights and activations into different levels of memory dram cache register file during inference the energy consumption is therefore determined by the number of nonzero elements in the weight and activation tensors to minimize the network loss under an energy constraint the authors develop a training framework including a novel greedy algorithm to compute the projection of the weight tensors to the energy constraint pros the proposed method allows to accurately impose an energy constraint in terms of the proposed model in contrast to previous methods and also yields a higher accuracy than these on some data sets the proposed solution seems sound although i did not check the proofs in detail and i am not very familiar with hardware energy consumption subtleties questions the experiments in sec 62 suggest that the activation mask is mainly beneficial when the data is highly structured how are the benefits in terms of weight and activation sparsity composed in the experiments on imagenet how does the weight sparsity of the the proposed method compare to the related methods in these experiments is weight sparsity in these cases a good proxy for energy consumption how does the activation sparsity decay parameter delta q affect the accuracyenergy consumption tradeoff for the two data sets the authors show that the weight projection problem can be solved efficiently how does the guarantee translate into wallclock time filter pruning methods 12 reduce both the size of the weight and activation tensors while not requiring to solve a complicated projection problem or introducing activation masks it would be good to compare to these methods or at least comment on the gains to be expected under the proposed energy consumption model knowledge distillation has previously been observed to be quite helpful when constraining neural network weights to be quantized andor sparse see 345 it might be worth mentioning this minor comments sec 34 1st paragraph subscript superscript sec 62 first paragraph pattens patterns aliened aligned 1 he y zhang x sun j 2017 channel pruning for accelerating very deep neural networks iccv 2017 2 li h kadav a durdanovic i samet h graf h p pruning filters for efficient convnets iclr 2017 3 mishra a marr d apprentice using knowledge distillation techniques to improve lowprecision network accuracy iclr 2018 4 tschannen m khanna a anandkumar a strassennets deep learning with a multiplication budget icml 2018 5 zhuang b shen c tan m liu l reid i towards effective lowbitwidth convolutional neural networks cvpr 2018 ### Summary:
all of the reviewers agree that this is a wellwritten paper with the novel perspective of minimizing energy consumption in neural networks as opposed to maximizing sparsity which does not always correlate with energy cost there are a number of promised clarifications and additional results that have emerged from the discussion that should be put into the final draft namely describing the overhead of converting from sparse to dense representations adding the imagenet sparsity results and adding the time taken to run the projection step
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 8631, 247, 5199, 323, 3733, 11454, 6928, 3066, 271, 6843, 7658, 327, 253, 2341, 7563, 347, 10066, 281, 819, 25004, 253, 1566, 1979, 347, 7744, 2218, 342, 2629, 13800, 3082, 50276, 681, 1148, 800, 1543, 403, 2011, 327, 247, 1643, 941, 5239, 835, 253, 4081, 1332, 41731, 13015, 2709, 1027, 7274, 50276, 1189, 455, 253, 4473, 310, 4722, 285, 5604, 812, 5276, 9865, 275, 7741, 48454, 12620, 50276, 23350, 891, 13280, 690, 33196, 347, 7000, 2708, 50276, 2577, 806, 4468, 310, 326, 436, 2929, 23141, 253, 8521, 854, 3239, 2701, 323, 4606, 326, 403, 16907, 3240, 15279, 50276, 9088, 403, 642, 1781, 5667, 4677, 296, 2272, 285, 4829, 253, 806, 721, 7223, 310, 816, 10199, 285, 4114, 2144, 50276, 3022, 3020, 253, 2929, 3651, 265, 247, 10665, 2408, 273, 2317, 15250, 7681, 1543, 2905, 281, 694, 1825, 471, 3237, 285, 2710, 299, 4277, 18921, 366, 5482, 533, 436, 10527, 2600, 3133, 8489, 19124, 285, 940, 25031, 1580, 352, 310, 417, 3587, 2905, 281, 253, 38754, 11193, 5700, 2686, 908, 323, 8542, 19007, 50276, 25914, 273, 436, 2144, 812, 452, 644, 4395, 281, 253, 24864, 594, 347, 281, 29534, 281, 253, 854, 3239, 2602, 2701, 50276, 468, 253, 17857, 32888, 37317, 7997, 9380, 14320, 48312, 1048, 4103, 281, 436, 2978, 943, 320, 24242, 625, 21038, 50276, 23955, 2523, 7033, 281, 253, 897, 273, 247, 8989, 323, 10938, 253, 37139, 414, 273, 2990, 14800, 50276, 20261, 417, 14969, 2074, 5609, 403, 2168, 908, 281, 819, 2517, 253, 1396, 569, 273, 3676, 6928, 323, 13800, 50276, 249, 1798, 2710, 4948, 273, 39762, 5926, 483, 9093, 897, 43904, 13461, 281, 5386, 253, 4833, 273, 1396, 569, 285, 263, 643, 2990, 4295, 2074, 281, 253, 8989, 278, 908, 310, 436, 789, 50276, 12554, 800, 6667, 2486, 425, 76, 314, 438, 729, 1162, 355, 18872, 17699, 16561, 819, 25004, 3066, 298, 2331, 1939, 43904, 6046, 295, 2824, 4240, 285, 29245, 478, 375, 1162, 355, 17699, 16561, 13800, 323, 3676, 4715, 295, 2824, 4240, 533, 627, 403, 1142, 643, 2905, 18075, 970, 690, 830, 273, 6194, 494, 7394, 390, 8989, 6830, 19191, 281, 2818, 819, 25004, 253, 2201, 13361, 285, 30105, 27691, 689, 253, 2469, 807, 452, 7418, 2905, 13800, 9380, 50276, 601, 891, 13414, 1908, 436, 4809, 273, 253, 2929, 281, 320, 747, 275, 667, 1534, 1039, 50276, 3062, 1189, 323, 253, 16774, 14023, 352, 651, 320, 1805, 281, 7277, 1411, 1375, 23037, 14387, 13800, 3082, 347, 10066, 281, 816, 253, 4767, 23542, 285, 256, 3433, 3082, 432, 4104, 285, 4022, 2975, 50276, 3229, 3784, 3916, 281, 253, 10214, 327, 3239, 898, 891, 651, 417, 1908, 841, 281, 320, 1375, 23037, 14387, 3082, 387, 436, 1127, 50276, 23955, 4385, 891, 452, 5001, 253, 4679, 310, 326, 4373, 22041, 285, 253, 897, 273, 3640, 940, 21755, 497, 7826, 24251, 323, 253, 4081, 1332, 285, 840, 10486, 3732, 281, 253, 11771, 11333, 323, 253, 13232, 273, 1481, 936, 2522, 5301, 50276, 2858, 281, 479, 604, 841, 42752, 403, 281, 320, 2908, 387, 512, 25184, 1364, 320, 2218, 9257, 285, 10939, 323, 1016, 5933, 50276, 4238, 436, 2686, 2218, 50276, 3062, 1189, 352, 651, 452, 644, 5322, 281, 923, 1543, 1293, 253, 34541, 4833, 273, 940, 21755, 281, 20843, 4973, 273, 7756, 533, 642, 28913, 2175, 497, 3559, 50276, 71, 3341, 5001, 253, 2600, 275, 2593, 608, 253, 2929, 9257, 10262, 271, 6843, 3033, 327, 2341, 326, 9142, 5644, 281, 247, 7658, 326, 310, 295, 545, 472, 816, 281, 2199, 327, 281, 3738, 16851, 5482, 2226, 326, 3469, 327, 690, 2228, 13761, 50276, 35529, 1014, 436, 4419, 271, 5933, 326, 310, 
11511, 347, 9542, 50276, 34235, 247, 38754, 5795, 310, 6012, 275, 253, 30762, 534, 18289, 11029, 347, 253, 2457, 30020, 2746, 50276, 2858, 387, 436, 1127, 352, 310, 642, 3356, 2590, 281, 479, 4555, 752, 3045, 23632, 3464, 342, 1675, 281, 253, 2341, 3033, 50276, 33921, 495, 10262, 247, 9648, 275, 8658, 13508, 3033, 285, 352, 310, 417, 387, 512, 13955, 849, 281, 4665, 436, 275, 667, 8542, 3282, 50276, 9939, 326, 846, 10012, 495, 2515, 403, 2529, 17580, 271, 8654, 12378, 476, 320, 2797, 533, 841, 1646, 4122, 8794, 3086, 285, 11543, 281, 4647, 275, 954, 2219, 50276, 29483, 595, 352, 651, 3176, 326, 18934, 14493, 327, 253, 2341, 812, 671, 320, 5611, 407, 3365, 29697, 3006, 3474, 26208, 253, 37139, 414, 327, 1016, 3828, 534, 5644, 281, 247, 1199, 19554, 12378, 3213, 50276, 1542, 1650, 247, 2969, 29438, 1159, 273, 253, 298, 17, 5222, 651, 320, 1199, 6927, 281, 22318, 285, 812, 5752, 347, 247, 13155, 3033, 327, 253, 2341, 1677, 326, 253, 6158, 943, 320, 247, 1327, 40600, 2355, 1159, 273, 253, 298, 17, 5222, 50276, 1279, 2934, 849, 824, 247, 3033, 26662, 281, 1110, 3559, 1677, 512, 253, 34754, 285, 38754, 5018, 326, 1364, 320, 2908, 50275, 977, 5701, 50276, 284, 271, 7092, 47641, 253, 4081, 5933, 337, 13237, 27221, 253, 4764, 2805, 534, 5760, 253, 37139, 414, 273, 253, 8989, 278, 50276, 2858, 436, 588, 5604, 6990, 253, 2341, 7563, 285, 891, 4282, 849, 1774, 352, 310, 281, 2126, 247, 2570, 2341, 7658, 604, 41458, 4419, 436, 1511, 273, 47641, 50275, 74, 858, 417, 923, 835, 253, 10671, 298, 50198, 12691, 275, 16186, 1722, 369, 19186, 2931, 3738, 891, 476, 5476, 752, 352, 310, 50275, 249, 2087, 352, 310, 8489, 45991, 326, 327, 1755, 273, 247, 2570, 1327, 44181, 3676, 2990, 2341, 1159, 816, 253, 1355, 749, 28872, 2424, 323, 35104, 4830, 253, 2341, 7658, 310, 295, 545, 472, 50276, 9154, 604, 34754, 403, 1896, 891, 4282, 604, 436, 4465, 10454, 310, 1900, 4409, 352, 4103, 594, 2969, 37139, 414, 3169, 13800, 3082, 534, 476, 320, 14556, 9009, 342, 4555, 4581, 630, 20553, 50275, 249, 2829, 337, 253, 4081, 1332, 310, 16318, 347, 1907, 253, 8004, 7200, 5926, 327, 15232, 5282, 292, 50276, 2858, 436, 310, 417, 2032, 299, 522, 310, 2406, 50276, 3022, 3020, 327, 247, 1591, 3024, 2036, 26672, 556, 271, 9696, 8654, 2341, 7152, 339, 431, 248, 2929, 310, 9940, 281, 2341, 3169, 13800, 273, 3676, 11454, 6928, 1223, 954, 2987, 327, 13800, 403, 9940, 281, 11052, 253, 1180, 273, 3602, 390, 11052, 253, 1180, 273, 5871, 281, 3885, 484, 390, 8493, 273, 3541, 33257, 841, 7274, 513, 417, 2085, 667, 23632, 327, 2341, 8353, 275, 436, 789, 253, 4477, 6012, 247, 2957, 323, 3733, 48257, 342, 2341, 10806, 285, 2530, 271, 13757, 5933, 323, 352, 253, 4477, 2692, 326, 253, 4081, 1332, 33526, 2169, 7200, 342, 2406, 2341, 8353, 1677, 253, 1072, 2341, 7563, 253, 5661, 1543, 403, 3240, 4722, 285, 2486, 1014, 4122, 18325, 2990, 31551, 257, 292, 87, 19, 50276, 43249, 3533, 285, 7350, 776, 2341, 14053, 1543, 403, 17618, 1411, 253, 4491, 45563, 277, 9866, 10309, 40022, 11498, 303, 812, 253, 4477, 4496, 21184, 327, 436, 6197, 50276, 531, 273, 253, 2022, 13260, 310, 253, 1563, 604, 253, 1318, 273, 253, 941, 310, 5058, 253, 10309, 476, 17049, 24497, 253, 941, 347, 2080, 347, 891, 871, 436, 310, 247, 3240, 2266, 9376, 326, 310, 417, 4516, 407, 1142, 35615, 849, 513, 253, 4477, 1379, 715, 2395, 18332, 273, 970, 23507, 941, 21453, 275, 824, 10309, 275, 616, 3311, 569, 310, 352, 1896, 281, 26065, 824, 3879, 275, 11498, 303, 25761, 275, 1142, 4980, 2718, 8711, 476, 760, 320, 1239, 275, 30151, 3103, 352, 476, 6379, 1180, 273, 8711, 2289, 265, 275, 577, 
50276, 6795, 963, 993, 285, 643, 3374, 3239, 854, 627, 4961, 271, 5933, 326, 476, 1089, 271, 271, 299, 4277, 3239, 854, 533, 352, 310, 1896, 281, 7989, 16851, 2900, 3239, 577, 50276, 262, 310, 1805, 281, 1691, 253, 6197, 835, 256, 27311, 267, 31482, 50276, 6438, 374, 275, 15895, 273, 253, 10012, 495, 352, 310, 1805, 281, 11120, 1375, 326, 247, 4428, 8870, 3904, 760, 1580, 305, 2428, 310, 908, 4583, 253, 2929, 310, 3542, 4518, 285, 10932, 973, 4428, 4722, 5661, 285, 10527, 1543, 5474, 339, 431, 248, 2929, 29328, 247, 1332, 323, 11454, 2990, 3733, 762, 247, 1892, 2341, 7658, 26332, 253, 1332, 23632, 253, 2341, 8353, 281, 320, 5170, 11542, 1754, 327, 247, 29479, 3781, 10309, 10336, 253, 4477, 1566, 253, 2341, 8353, 273, 27090, 253, 13461, 285, 1396, 569, 715, 1027, 2308, 273, 3541, 8711, 11556, 8749, 1873, 1309, 17032, 253, 2341, 8353, 310, 3103, 3413, 407, 253, 1180, 273, 28078, 3603, 275, 253, 2801, 285, 5743, 47454, 281, 15338, 253, 2990, 2957, 762, 271, 2341, 7658, 253, 4477, 1287, 247, 3733, 7792, 1690, 247, 4460, 38754, 5933, 281, 11897, 253, 12378, 273, 253, 2801, 47454, 281, 253, 2341, 7658, 50276, 856, 84, 50276, 783, 4081, 1332, 4483, 281, 13613, 16209, 271, 2341, 7658, 275, 2426, 273, 253, 4081, 1566, 275, 4499, 281, 2045, 3082, 285, 671, 11026, 247, 2169, 7200, 685, 841, 327, 690, 941, 5239, 253, 4081, 2900, 3133, 3590, 3738, 891, 858, 417, 2451, 253, 27947, 275, 2508, 285, 891, 717, 417, 1077, 7615, 342, 10309, 2341, 8353, 8482, 1059, 447, 50276, 34974, 50276, 783, 4679, 275, 4706, 9743, 1804, 326, 253, 5743, 8989, 310, 7194, 12912, 672, 253, 941, 310, 4122, 18872, 849, 403, 253, 5373, 275, 2426, 273, 2801, 285, 5743, 37139, 414, 9924, 275, 253, 4679, 327, 4440, 257, 292, 849, 1057, 253, 2801, 37139, 414, 273, 253, 253, 4081, 1332, 7277, 281, 253, 2905, 3082, 275, 841, 4679, 310, 2801, 37139, 414, 275, 841, 2219, 247, 1175, 17335, 323, 2341, 8353, 50276, 5430, 1057, 253, 5743, 37139, 414, 10027, 4764, 18687, 2805, 2818, 253, 7200, 14115, 8353, 5454, 2727, 323, 253, 767, 941, 5239, 50276, 783, 4477, 921, 326, 253, 2801, 12378, 1895, 476, 320, 14042, 14556, 849, 1057, 253, 12215, 16497, 715, 3402, 13273, 673, 50276, 10978, 819, 25004, 3082, 1249, 4796, 1097, 253, 1979, 273, 253, 2801, 285, 5743, 47454, 1223, 417, 10568, 281, 8415, 247, 9542, 12378, 1895, 390, 16984, 5743, 25965, 352, 651, 320, 1175, 281, 7277, 281, 841, 3082, 390, 387, 1878, 4385, 327, 253, 15988, 281, 320, 3264, 762, 253, 4081, 2341, 8353, 1566, 50276, 36871, 940, 21755, 556, 3786, 644, 2540, 281, 320, 3240, 9371, 672, 1030, 26208, 11454, 2990, 13461, 281, 320, 2677, 1025, 285, 263, 23507, 923, 32670, 352, 1537, 320, 4409, 29570, 436, 50276, 37585, 5701, 50276, 1704, 5910, 337, 296, 12494, 749, 3866, 50276, 8403, 398, 1687, 50276, 1704, 9743, 806, 12494, 869, 85, 561, 50276, 17523, 84, 14165, 264, 50276, 2132, 50276, 18, 344, 340, 1182, 12109, 1269, 50276, 13998, 480, 4240, 5048, 819, 25004, 323, 38757, 1077, 3676, 11454, 6928, 17857, 17312, 4240, 374, 632, 288, 40303, 580, 247, 277, 11180, 266, 32733, 891, 1775, 292, 288, 50276, 17676, 71, 288, 268, 819, 25004, 15116, 323, 5919, 2410, 47301, 17857, 32888, 4240, 495, 49285, 376, 247, 50276, 78, 3298, 277, 35016, 547, 970, 3640, 940, 21755, 5609, 281, 3157, 1698, 40540, 2990, 7200, 17857, 32888, 4765, 577, 246, 10629, 1136, 257, 278, 26856, 9045, 247, 50276, 266, 395, 76, 22711, 247, 1213, 515, 2477, 1507, 3676, 4715, 342, 247, 25219, 7563, 17857, 1686, 4765, 608, 1182, 11917, 606, 270, 703, 79, 260, 23136, 278, 632, 86, 298, 50276, 250, 301, 891, 4404, 3576, 1698, 
2713, 3429, 27311, 267, 11454, 6928, 30105, 1087, 4765, 187, 187, 4118, 18435, 27, 455, 273, 253, 30628, 5194, 326, 436, 310, 247, 973, 15720, 2929, 342, 253, 4460, 8668, 273, 28699, 2341, 8353, 275, 11454, 6928, 347, 10066, 281, 46875, 37139, 414, 534, 1057, 417, 1900, 24888, 342, 2341, 2105, 627, 403, 247, 1180, 273, 12316, 8254, 6787, 285, 3081, 1543, 326, 452, 13082, 432, 253, 5955, 326, 943, 320, 1691, 715, 253, 2457, 7482, 10775, 12930, 253, 18332, 273, 22022, 432, 23507, 281, 14086, 14237, 6240, 253, 4440, 257, 292, 37139, 414, 1543, 285, 6240, 253, 673, 2668, 281, 1408, 253, 12378, 3213 ]
[ attention_mask: all 1s (every position attended; no padding) — full list omitted ]
[ labels: verbatim duplicate of the input_ids list above — full list omitted ]
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper presents an empirical study on an optimization anomaly which is referred to as the slingshot mechanism/effect by the authors when adaptive optimizers eg adams are used for solving supervised learning problems specifically it was found that when adaptive optimizers were used cyclic phase transitions between stable and unstable training occurred and were correlated with how the norm of the lastlayer weights changed the authors also reported findings on the correlations between the onset of the slingshot mechanism and the grokking phenomenon strengths the reported findings hint that extra care may be required when adaptive optimizers are used the results may also promote further theoretical studies on the limitations of using adaptive optimizers in supervised learning weaknesses additional numerical experiments could have been conducted to help future theorists better understand the source of issues in particular some rather simple diagnoses could have been done but were missing from the paper there is much room for improvement in presentation of the paper see the specific comments in the questions section i do not have any comments on this docsepthe paper studies the grokking phenomenon introduced by power et al 1 which is a curious phenomenon of delayed generalization this paper describes a characteristic of training termed the slingshot mechanism that is found in a number of tasks and optimization settings the slingshot mechanism refers to periods of instability found in adaptive optimizers in the late stages of training these periods of instability are accompanied by norm increases in the classification layer as well as an improvement in generalization performance for the grokking tasks introduced in 1 the paper studies this phenomenon and finds that it occurs in a number of different tasks more general than those in 1 and finds that it is controlled by the epsilon parameter in adaptive optimizers strengths the paper has a good and comprehensive discussion of related work the paper characterizes an interesting phenomenon that is sometimes found in the optimization trajectory of adaptive optimizers interesting observations regarding the effect of epsilon on the slingshot behavior weaknesses there is no explanation empirical or otherwise for how the slingshot mechanism may relate to generalization understanding the delayed generalization is an important aspect of why the grokking phenomenon in 1 is interesting to the community i believe that in the current state without a deeper understanding this work may be more appropriate for a workshop submission the paper claims that the slingshot mechanism is a general phenomenon however on cifar10 the phenomenon was mostly observed when training with extremely small training data eg 200 examples and no correlation to better generalization was shown there is one experiment in the appendix where the full cifar10 dataset is used in training a vit model but this model only achieves a final test accuracy of 80 and its not clear how much test accuracy is improved due to the slingshot mechanism due to a single run and increasing test loss this calls into question how relevant or general the identified behavior is to modern day neural networks beyond the weaknesses i listed the authors were good at addressing several limitations of this work docsepthe authors describe a
purportedly cyclic effect can be observed when the norm of the final layer sharply increases often accompanied by an increase in train loss the authors offer the slingshot effect as a possible explanation for the ability of adam to induce generalization even without explicit regularization this is overall a nicely executed empirical paper that reveals a moderately interesting phenomenon a cyclic regime of generalization would be quite remarkable and this paper offers preliminary evidence that supports the authors hypothesis it is also refreshing that the authors did not attempt to handwave nonsense poorlymotivated math into the paper which is unfortunately a severe problem even in 2022 my main concerns are the following significance given that this effect explains generalization in unregularized optimization the overall impact to practical deep learning is somewhat muted strength of evidence the results in the paper leave me with certain questions most notably 6 below as to whether the authors have sufficiently validated their hypothesized effect implications the paper enumerates a number of interesting facts eg dependence on epsilon but doesnt really dive deeper beyond some of those facts leaving the reader wondering as to why those facts were presented overall i enjoyed reading this paper i dont think its super noteworthy in terms of immediate impact but it is as far as i can tell a novel discovery that may serve as a waypoint on our path to better understanding of oddball generalization phenomena in deep learning the authors do a good job not overclaiming and they explicitly enumerate a number of other regimes that achieve the results despite lack of slingshot certain items are left unaddressed see questions minor line 84 missing author name line 112 what is adam without momentum and how is it different from rmsprop line 145 layers rightarrow layer line 170 has rightarrow have line 245 choices rightarrow choice line 254 its rightarrow its ### Summary:
the paper examines a widely known phenomenon when training neural networks with adaptive optimizers where the training loss cyclically alternates in later stages some evidence is given that in the absence of explicit regularization this is associated with improved generalization the more concrete contribution of the paper is to show a strong link between the aforementioned cyclicity and sudden cyclic growth in the weight of the last layer of the network the main concern is that beyond this empirical observation a specific mechanism behind the phenomenon is not identified nor is a clear connection made with the apparent benefit to generalization while the observations have merit it is hard to determine whether causeeffect relationships exist between them making the paper feel like a work in progress and only weakly significant to the community judging by the reviewers being underwhelmed if the authors are convinced of their message that slingshots cause grokking in contrast to eg being a byproduct while the real mechanism lies elsewhere then they are advised to show exactly that in the further elaboration of their work
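To make the training dynamics described in these reviews easier to picture, here is a minimal sketch of the diagnostic they repeatedly refer to: tracking the classification-layer weight norm alongside the training loss while optimizing with adam. This is an illustration only — the model, data, and hyperparameters are placeholders rather than the paper's setup; the eps argument is shown explicitly because the reviews single it out as the parameter controlling whether the cyclic behavior appears.

```python
# Minimal sketch (not the paper's code): log the last-layer weight norm and the
# training loss under Adam; a joint spike in both is the signature the reviews
# associate with a "slingshot" phase. Model, data, and eps value are assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(512, 32)            # placeholder inputs
y = torch.randint(0, 5, (512,))     # placeholder labels

model = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 5))
opt = torch.optim.Adam(model.parameters(), lr=1e-3, eps=1e-8)  # eps: the knob the reviews highlight
loss_fn = nn.CrossEntropyLoss()

log = []
for step in range(10_000):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
    # record step, training loss, and the norm of the final (classification) layer
    log.append((step, loss.item(), model[-1].weight.norm().item()))

# Inspecting `log` for sudden simultaneous jumps in the loss and the final-layer
# norm corresponds to the cyclic stable/unstable pattern described above.
```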
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 10262, 271, 16774, 1263, 327, 271, 13757, 30207, 534, 310, 6289, 281, 347, 253, 1499, 723, 12022, 5122, 8222, 407, 253, 4477, 672, 17825, 5556, 14460, 24088, 519, 1317, 403, 908, 323, 16161, 22296, 4715, 3237, 5742, 352, 369, 1119, 326, 672, 17825, 5556, 14460, 497, 908, 19870, 3408, 16307, 875, 6474, 285, 17631, 3733, 5866, 285, 497, 9578, 342, 849, 253, 5222, 273, 253, 1390, 12026, 13461, 4391, 253, 4477, 671, 2361, 4342, 327, 253, 13007, 875, 253, 11653, 273, 253, 1499, 723, 12022, 5122, 285, 253, 1289, 536, 4351, 11562, 50276, 296, 3755, 20556, 50275, 783, 2361, 4342, 12662, 326, 4465, 1557, 778, 320, 2424, 672, 17825, 5556, 14460, 403, 908, 253, 1543, 778, 671, 8591, 2007, 10527, 2175, 327, 253, 7364, 273, 970, 17825, 5556, 14460, 275, 22296, 4715, 50276, 20881, 1255, 265, 50275, 38092, 10704, 4679, 812, 452, 644, 5196, 281, 1361, 2852, 29075, 1346, 1805, 2096, 253, 2603, 273, 3374, 275, 1798, 690, 2581, 2969, 28978, 812, 452, 644, 2218, 533, 497, 5816, 432, 253, 2929, 50276, 9088, 310, 1199, 2316, 323, 7756, 275, 9759, 273, 253, 2929, 923, 253, 2173, 5701, 275, 253, 3533, 2593, 891, 513, 417, 452, 667, 5701, 327, 436, 5474, 339, 431, 248, 2929, 2175, 253, 7753, 76, 4351, 11562, 5611, 407, 1612, 1162, 355, 337, 534, 310, 247, 14338, 11562, 273, 13444, 26647, 436, 2929, 8631, 247, 8847, 273, 3733, 23776, 253, 1499, 723, 12022, 5122, 326, 310, 1119, 275, 247, 1180, 273, 8892, 285, 13757, 7533, 253, 1499, 723, 12022, 5122, 10770, 281, 9894, 273, 17620, 1119, 275, 17825, 5556, 14460, 275, 253, 3563, 8661, 273, 3733, 841, 9894, 273, 17620, 403, 11704, 407, 5222, 5459, 275, 253, 9162, 3828, 347, 973, 347, 271, 7756, 275, 26647, 3045, 323, 253, 7753, 76, 4351, 8892, 5611, 275, 337, 253, 2929, 2175, 436, 11562, 285, 9010, 326, 352, 6634, 275, 247, 1180, 273, 1027, 8892, 625, 2087, 685, 1110, 275, 337, 285, 1089, 326, 352, 310, 6537, 407, 253, 299, 4277, 4764, 275, 17825, 5556, 14460, 50276, 296, 3755, 20556, 50275, 783, 2929, 556, 247, 1175, 285, 11088, 5955, 273, 2905, 789, 50276, 783, 2929, 45589, 271, 4722, 11562, 326, 310, 4536, 1119, 275, 253, 13757, 18974, 273, 17825, 5556, 14460, 50276, 47606, 7313, 5001, 253, 1055, 273, 299, 4277, 327, 253, 1499, 723, 12022, 3879, 50275, 20881, 1255, 265, 50275, 9088, 310, 642, 8813, 16774, 390, 5010, 323, 849, 253, 1499, 723, 12022, 5122, 778, 14588, 281, 26647, 4685, 253, 13444, 26647, 310, 271, 1774, 4809, 273, 2139, 253, 7753, 76, 4351, 11562, 275, 337, 310, 4722, 281, 253, 3114, 891, 2868, 326, 275, 253, 1655, 1375, 1293, 247, 12861, 4685, 436, 789, 778, 320, 625, 4569, 323, 247, 22586, 19529, 50276, 783, 2929, 3916, 326, 253, 1499, 723, 12022, 479, 348, 507, 303, 310, 247, 2087, 11562, 2299, 327, 260, 338, 274, 740, 253, 11562, 369, 6571, 2540, 672, 3733, 342, 6685, 1355, 3733, 941, 24088, 1052, 6667, 285, 642, 5921, 281, 1805, 26647, 369, 2011, 627, 310, 581, 3368, 275, 253, 30762, 835, 253, 2120, 260, 338, 274, 740, 10895, 310, 908, 275, 3733, 247, 9084, 1566, 533, 436, 1566, 760, 33526, 247, 2457, 1071, 7200, 273, 5096, 285, 697, 417, 2590, 849, 1199, 1071, 7200, 310, 5520, 1955, 281, 253, 1499, 723, 12022, 5122, 1955, 281, 247, 2014, 1408, 285, 3629, 1071, 2957, 436, 5841, 715, 1953, 849, 4623, 390, 2087, 253, 3636, 3879, 310, 281, 4980, 1388, 11454, 6928, 50275, 42218, 253, 32213, 891, 7117, 253, 4477, 497, 1175, 387, 15974, 2067, 7364, 273, 436, 789, 5474, 339, 431, 248, 4477, 6266, 247, 
1499, 723, 12022, 1055, 1246, 275, 253, 897, 273, 38622, 285, 37827, 436, 28751, 314, 19870, 1055, 476, 320, 2540, 672, 253, 5222, 273, 253, 2457, 3828, 23071, 5459, 2223, 11704, 407, 271, 2572, 275, 6194, 2957, 253, 4477, 3959, 253, 1499, 723, 12022, 1055, 347, 247, 1896, 8813, 323, 253, 3745, 273, 38622, 281, 10808, 26647, 1014, 1293, 6843, 37820, 436, 310, 4583, 247, 23395, 11407, 16774, 2929, 326, 12957, 247, 28249, 4722, 11562, 247, 19870, 9459, 273, 26647, 651, 320, 3240, 13406, 285, 436, 2929, 6131, 12611, 1941, 326, 8525, 253, 4477, 9079, 352, 310, 671, 31255, 326, 253, 4477, 858, 417, 3177, 281, 1133, 15007, 25333, 15225, 24013, 8550, 14168, 715, 253, 2929, 534, 310, 19235, 247, 5460, 1895, 1014, 275, 1384, 1423, 50276, 2577, 2022, 7350, 403, 253, 1563, 50275, 9188, 40348, 1677, 326, 436, 1055, 11424, 26647, 275, 440, 12846, 1025, 13757, 253, 4583, 3486, 281, 8542, 3676, 4715, 310, 8489, 2873, 264, 50276, 45563, 273, 1941, 253, 1543, 275, 253, 2929, 3553, 479, 342, 2176, 3533, 954, 19836, 721, 2708, 347, 281, 1880, 253, 4477, 452, 10481, 17618, 616, 24045, 1055, 50276, 303, 35663, 253, 2929, 30482, 684, 247, 1180, 273, 4722, 5441, 24088, 10096, 327, 299, 4277, 533, 36908, 1663, 25760, 12861, 4457, 690, 273, 1110, 5441, 6108, 253, 9414, 12371, 347, 281, 2139, 1110, 5441, 497, 3559, 50276, 1189, 455, 891, 11346, 4361, 436, 2929, 891, 13414, 1158, 697, 2221, 35092, 275, 2426, 273, 8993, 3486, 533, 352, 310, 347, 2080, 347, 891, 476, 2028, 247, 4460, 8900, 326, 778, 5752, 347, 247, 1039, 3659, 327, 776, 1854, 281, 1805, 4685, 273, 8909, 2910, 26647, 16958, 275, 3676, 4715, 253, 4477, 513, 247, 1175, 2628, 417, 689, 43759, 285, 597, 11120, 49860, 247, 1180, 273, 643, 27005, 326, 5115, 253, 1543, 5747, 3480, 273, 1499, 723, 12022, 2176, 4957, 403, 1669, 440, 1911, 2079, 923, 3533, 50276, 37585, 50274, 1282, 11130, 5816, 2488, 1416, 50276, 1282, 11633, 752, 310, 38622, 1293, 10254, 285, 849, 310, 352, 1027, 432, 391, 983, 8560, 50276, 1282, 19092, 8090, 987, 2501, 3828, 50276, 1282, 18672, 556, 987, 2501, 452, 50276, 1282, 22752, 10165, 987, 2501, 4327, 50276, 1282, 29501, 697, 987, 2501, 697, 2490, 187, 4118, 18435, 27, 783, 2929, 33888, 247, 7561, 1929, 11562, 672, 3733, 11454, 6928, 342, 17825, 5556, 14460, 835, 253, 3733, 2957, 6776, 1037, 3960, 684, 275, 1996, 8661, 690, 1941, 310, 1677, 326, 275, 253, 5928, 273, 6843, 37820, 436, 310, 2330, 342, 5520, 26647, 253, 625, 11859, 7680, 273, 253, 2929, 310, 281, 921, 247, 2266, 3048, 875, 253, 18979, 6776, 5755, 285, 5982, 19870, 3116, 275, 253, 2801, 273, 253, 1390, 3828, 273, 253, 2990, 253, 2022, 4468, 310, 326, 4457, 436, 16774, 8310, 247, 2173, 5122, 3212, 253, 11562, 310, 417, 3636, 4543, 247, 2590, 4602, 310, 1160, 342, 253, 5165, 5649, 281, 26647, 1223, 253, 7313, 452, 15785, 352, 310, 1892, 281, 3653, 1880, 2847, 8222, 7688, 2226, 875, 731, 2403, 253, 2929, 1928, 347, 247, 789, 249, 24951, 285, 760, 22112, 1534, 281, 253, 3114, 32721, 407, 253, 30628, 1146, 762, 11622, 1314, 604, 253, 4477, 403, 13762, 273, 616, 3935, 326, 1499, 723, 40396, 2847, 7753, 76, 4351, 275, 4499, 281, 24088, 1146, 247, 407, 7509, 1223, 253, 1524, 5122, 8696, 11358, 840, 597, 403, 15140, 281, 921, 4555, 326, 275, 253, 2007, 14883, 318, 273, 616, 789 ]
[ attention_mask: all 1s (every position attended; no padding) — full list omitted ]
[ labels: verbatim duplicate of the input_ids list above — full list omitted ]
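For readers unfamiliar with the numeric lists attached to each row above: they are ordinary tokenizer output — input_ids encoding the review-plus-summary text, an attention_mask that is all ones because nothing is padded, and labels that simply copy input_ids for causal language-model training. The snippet below is a hypothetical reconstruction of how such a row could be built; the actual tokenizer and prompt template behind these particular ids are not stated, so the names used here (including the gpt2 stand-in) are assumptions.

```python
# Hypothetical reconstruction (not the dataset's actual build script): encode a
# review/summary pair into input_ids, attention_mask, and labels.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # stand-in; the real tokenizer is unknown

def build_row(review_text: str, summary_text: str) -> dict:
    text = review_text + "\n### Summary:\n" + summary_text  # template guessed from the text columns
    enc = tokenizer(text)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all 1s when no padding is applied
        "labels": list(enc["input_ids"]),         # verbatim copy, as in the rows above
    }
```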
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper extends the work of hindsight experience replay to goalconditioned policy gradient methods hindsight which allows one to learn policies conditioned on some goal g from offpolicy experience generated by following a different goal is cast in the framework of importance sampling the authors show how one can simply rewrite the goalconditioned policy gradient by first sampling a trajectory conditioned on some goal g and then computing the closed form gradient in expectation over all goals this gradient is unbiased if the rewards are offpolicy corrected along the generated trajectories while this naive formulation is found to be unstable the authors propose a simple normalized importance sampling formulation which appears to work well in practice to further reduce variance and computational costs the authors also propose goal subsampling mechanisms which sample goals which are likely along the generated trajectories the method is evaluated on the same bitflipping environment as 1 and a variety of discrete environments grid worlds ms pacman simulated robot arm where the method appears highly effective unfortunately for reasons which remain unclear hindsight policy gradients with value baselines appear unstable quality this paper scores high wrt quality the theoretical contributions of the method are solid the experiments are well designed and highlight the efficacy of the method as well as areas for improvement in particular i commend the authors for the rigorous analysis bootstrapped error estimates separate seeds for hyperparameters and reporting test error etc including the additional results found in the appendix sensitivity and ablative analyses that being said the paper could benefit from experiments in the continuous control domain and a direct headtohead comparison with her while i do not anticipate the proposed method to outperform her in terms of dataefficiency due to the use of replay the comparison would still be informative to the reader clarity the paper is well written and easy to follow if anything the authors could have abridged sections 2 and 3 in favor of other material found in the appendix as goalconditioned policy gradients and variants are straightforward generalizations of standard policy gradient methods originality novelty is somewhat low for the paper as hindsight experience replay already presented a very similar offgoalcorrection mechanism for actorcritic methods ddpg the method is also very similar to 2 the connection to which should also be discussed significance despite the low novelty i do believe there is value in framing hindsight as importance sampling in goalconditioned policy gradients this combined with the clear presentation and thorough analysis in my opinion warrants publication and will certainly prove useful to the community significance could be improved further should the paper feature a more prominent discussion comparison to her along with a fix for the instabilities which occur when using their method in conjunction with a value baseline 1 hindsight experience replay marcin andrychowicz et al 2 dataefficient hierarchical reinforcement learning ofir nachum shixiang gu honglak lee sergey levine detailed comments section 2 this formulation allows the probability of a state transition given an action to change across timesteps within an episode i do not understand this statement as p(st+1 | st, at) is the same transition distribution found
in standard mdps and appears stationary wrt time theorems 31 31 and equations a bit lengthy and superfluous consider condensing the material section 5 i found the change in notation from lower to uppercase somewhat jarring also the notation used for empirical samples from the minibatch is confusing if ait is meant to be the action at timestep t for the ith trajectory in the minibatch then what does gi g mean i realize this means evaluating the probability by setting the goal state to g but this is confusing especially when other probabilities are evaluated conditioned on gi directly section 6 which would often require the agent to act after the end of an episode do you mean that most episodes have length t t and as such we would waste time generating longer trajectories re baseline instabilities plotting the loss function for the value function could shed light on the instability docsepthe authors present hpg which applies the hindsight formulation already applied to offpolicy rl algorithms hindsight experience replay her andrychowicz et al 2017 to policy gradients because the idea is not new and formulating hpg from pg is so straightforward simply tie the dynamical model over goals the work seems incremental also going off policy in pg is known to be quite unstable and so im not sure that simply using the well known approach of normalized importance weights is in practice enough to make this a widely useful algorithm for hindsight rl evaluation 35 how does hpg compare to her the only common experiment appears to be bitflipping which it appears looking back at the her paper no reference to her performance in this paper to significantly underperform her in general i think that the justification for proposing hpg and possible advantages over her need to be discussed why should we generalize what is considered an onpolicy algorithm like pg to handle hindsight when her seems ideally suited for such scenarios why not design an experiment that showcases the advantages of hpg over her clarity 45 generally well explained significance 35 the importance of hpg relative to offpolicy variants of hindsight is not clear are normalized importance weights a well established variance reduction technique enough to make hpg highly effective do we really want to be running separate policies for all goals with the practical need to do goal subsampling is hpg really a strong algorithm eg compared to her why does hpg degrade later in training sometimes when a baseline is added this is strange and warrants further investigation originality 25 more straightforward extension of previous work based on current presentation overall i feel that hpg is a more straightforward extension of previous work and is not yet at least adequately justified in the paper ie over her furthermore the experiments seem very preliminary and the paper needs further maturation ie more discussion about and experimental comparison with previous work stronger experiments and justification rating 510 weak reject confidence 45 updated review the authors have updated the appendix with new results comparing against her and provided detailed responses to all of my concerns thank you authors while not all of my concerns have been addressed see below the new results and discussion that have been added to the paper make me much more comfortable with recommending acceptance the formulation while straightforward and not without limitations has been shown in preliminary experiments to be effective while many important details eg robust baselines and
ultimate performance still need to be worked out hpg is almost certainly going to end up being a widely used addition to the rl toolbox good paper recommend acceptance evaluationclarityoriginalitysignificance 35434 remaining concerns the poor performance of the baselines may indeed be due to lack of hindsight but this should really be debugged and addressed by the final version of the paper results throughout the paper are shown for only the first 100 evaluation steps in many of the figures the baselines are still improving and are highly competitive some extended results should be included in the final version of the paper at least in the appendix as pointed out it is difficult to compare the her results directly and it is fair to initially avoid confounding factors but polyakaveraging and temporal difference target clipping are important optimization tricks i think it would strengthen the paper to optimize both the pg and dqn based methods and provide additional results to get a better idea of where things stand on these andor possibly a more complicated set of tasks docsepfollowing recent work on hindsight experience replay andrychowicz et al 2017 the authors extend the idea to policy gradient methods they formally describe the goalconditioned policy gradient setup and derive the extensions of the classical policy gradient estimators their key insight to deriving a computationally efficient estimator is that for many situations only a small number of goals will be active in a single trajectory then they conduct extensive experiments on a range of problems and show that their approach leads to improvements in sample efficiency for goalconditioned tasks although the technical novelty of the paper is not high many of the estimators follow straightforwardly from previous results however the goal subsampling idea is a nice contribution the paper is well written the topic is of great interest and the experiments are extensive and insightful i expect that this will serve as a nice reference paper in the future and launching point for future work the only major issue i have is that there is no comparison to her i think it would greatly strengthen the paper to have a comparison with her i dont think it diminishes their contributions if her outperforms hpg so i hope the authors can add that comments in sec 61 it seems surprising that gcpgb underperforms gcpg i understand that hpgb may underperform hpg but usually for pg methods a baseline helps do you understand whats going on here in sec 62 it would be helpful to plot the average return of the optimal policy for comparison otherwise its hard to know if the performance is good or bad also do you have any explanations for why hpg does poorly on the four rooms raising my score after the authors responded to my questions and added the her results ### Summary:
the paper generalizes the concept of hindsight ie the recycling of data from trajectories in a goalbased system based on the goal state actually achieved to policy gradient methods this was an interesting paper in that it scored quite highly despite all three reviewers mentioning incrementality or a relative lack of novelty although the authors naturally took some exception to this ac personally believes that properly executed contributions that seem quite straightforward in hindsight pun partly intended can be valuable in moving the field forward a clean and didactic presentation of theory backed by welldesigned and extensive empirical investigation both of which are adjectives used by reviewers to describe the empirical work in this paper can be as valuable or more so than poorly executed but highernovelty works to quote anonreviewer3 hpg is almost certainly going to end up being a widely used addition to the rl toolbox feedback from reviewers prompted extensive discussion and a direct comparison with hindsight experience replay which reviewers agreed added significant value to the manuscript earning it a postrebuttal unanimous rating of 7 it is therefore my pleasure to recommend acceptance
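Since all three reviews and the meta-review center on the hindsight reweighting idea, a compact sketch may help: a trajectory collected while pursuing one goal is reused for an alternative goal by weighting its policy-gradient term with a normalized (self-normalized) importance ratio. This is my reading of the reviews' description, not the paper's reference implementation — per-goal baselines, goal subsampling, and the exact return definition are omitted, and all tensor shapes are assumptions.

```python
# Rough sketch of hindsight reweighting with self-normalized importance weights
# (an illustration of the idea described above, not the paper's estimator).
import torch

def hindsight_policy_gradient_loss(logp_alt, logp_behav, returns_alt):
    # logp_alt:    (batch, T) log pi(a_t | s_t, g_alt)   for an alternative goal
    # logp_behav:  (batch, T) log pi(a_t | s_t, g_behav) for the goal actually pursued
    # returns_alt: (batch,)   trajectory return re-evaluated under the alternative goal
    log_ratio = (logp_alt - logp_behav).sum(dim=1)       # per-trajectory log importance ratio
    weights = torch.softmax(log_ratio, dim=0).detach()   # normalized importance weights (sum to 1)
    # weighted policy-gradient surrogate: maximize return-scaled log-likelihood of the batch
    return -(weights * returns_alt * logp_alt.sum(dim=1)).sum()
```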
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper describes a framework tree reconstruction error tre for assessing compositionality of representations by comparing the learned outputs against those of the closest compositional approximation the paper demonstrates the use of this framework to assess the role of compositionality in a hypothetical compression phase of representation learning compares the correspondence of tre with human judgments of compositionality of bigrams provides an explanation of the relationship of the metric to topographic similarity and uses the framework to draw conclusions about the role of compositionality in model generalization overall i think this is a solid paper with an interesting and reasonable approach to quantifying compositionality and a fairly compelling set of results the reported experiments cover reasonable ground in terms of questions relevant to compositionality relationship to representation compression generalization and i appreciate the comparison to human judgments which lends credibility to applicability of the framework the results are generally intuitive and reasonable enough to be credible as indicators of how compositionality relates to aspects of learning while providing some potential insight the paper is clearly written and to my knowledge the approach is novel i would say the main limitation to the conclusions that can be drawn from these experiments lies in the necessity of committing to a particular composition operator of which the authors have selected very simple ones without comparing to others there is nothing obviously unreasonable about the choices of composition operator but it seems that the conclusions drawn cannot be construed to apply to compositionality as a general concept but rather to compositionality when defined by these particular operators similar limitations apply to the fact that the tests have been run on very specific tasks it is not clear how these conclusions would generalize to other tasks despite this limitation im inclined to say that the introduction of the framework is a solid contribution and the results presented are interesting i think this is a reasonable paper to accept for publication minor comment p8 typo training and accuracies reviewer 2 makes a good point that the presentation of the framework could be much clearer currently obscuring the central role of learning the primitive representations this is something that would benefit from revision reviewer 2s comments also remind me that from a perspective of learning compositionready primitives fyshe et al 2015 is a relevant reference here as it similarly learns primitive word representations to be compatible with a chosen composition function beyond issues of presentation it seems that we are all in agreement that the papers takeaways would also benefit from an increase in the scope of the experiments im happy to adjust my score to reflect this reference fyshe et al 2015 a compositional and interpretable semantic space docsepedit and a further question reading again section 7 im wondering whether the the high generalization is possible due to the fact that at test time only one of the two candidates is unseen and the other is seen having both candidates to be unseen makes the problem significantly harder since the only way for the listener to get it right is to associate the message with the right candidate rather than relying in some other strategy like whether the 
message is novel thus its the seen candidate or new thus its the unseen candidate as such i dont think i can fully trust your conclusions due to this potential confounder the authors propose a measure of compositionality in representations given instances of data x annotated with semantic primitives the authors learn a vector for each of the primitive such that the addition of the vectors of the primitives is very close in terms of cosine to the latent representation z of the input x the authors find that this measure correlates with the mutual information between the input x and z approximates the human judges of compositionality on a language dataset and finally presents a study on the relation between the proposed measure and generalizalization performance concluding that their measure correlates with generalization error as well as absolute test accuracy this in an interesting study and attacks a very fundamental question tracking compositionality in representations could pave the way towards representations that facilitate transfer learning and better generalization while the paper is very clear with respects to results i found the presentation of the proposed measure overly confusing and somewhat more exaggerated that what is really going on the authors start with a very clean example that can potentially facilitate clarifying in a visual way the process of obtaining the measure however i feel that clarity is being tradedoff for formality it needs several reads to really distill the idea that essentially the authors are simply learning vectors of primitives that when added should resemble the representation of the input moreover the name of the measure is a bit misleading and not justified by the experiments and the data the authors do not deal with trees in any of the examples but rather with a set of primitives apparent in the use of addition as a composition function which being commutative does not allow for wordorder and the like deep syntactic properties now onto the measure i like the idea of learning basis vectors from the representations and constraining to follow the primitive semantics of course this constraints quite a bit the form of compositionality that the authors are searching for the idea of additive semantics has been explored in nlp however its mostly applicable for primitives with intersective semantics eg a white towel is something that is both white and a towel do the authors think that this restricts their experiments especially the natural languages ones what about other composition techniques found in the literature of compositional semantics eg by baroni and zamparelli 2010 this is good to be clarified moreover given the simplicity of the datasets in the current study wouldnt a reasonable baseline be to obtain the basis vector of blue by averaging all the latent representations of blue similarly how sensitive are conclusions with respect to different composition functions section 4 is potentially very interesting but i dont seem to understand why its good news that tre correlates with ixtheta low tre indicates highdegree of compositionality i suspect that low mi means that input and latent representation are somewhat independent but i dont see the connection to compositional components can the authors clarify section 5 is a nice addition the authors mention that they learn word and phrase representations where are the word representations used my understanding is that you derive basis word representations by using sgd and the phrase vectors and compute tre 
with these if this is the case an interesting experiment would be to report how similar the induced basis vectors are either some firstorder or secondorder similarity to the pretrained ones section 8 presents results on discrete representations since this is the experiment most similar to the recent work that uses topographic similarity and since the authors already prime the reader at section 7 about relation between the 2 measures it would be interesting to see the empirical relation between tre and topographic and its relation to generalization and absolute performance baroni and zamparelli 2010 nouns are vectors adjectives are matrices representing adjectivenoun constructions in semantic space docsepthe paper tackles a very interesting problem about representations especially of the connectionist kind how do we know if the learned representations capture the compositional structure present in the inputs and tries to come up with a systematic framework to answer that question the framework assumes the presence of an oracle that can give us the true compositional structure then the author try to answer some refreshing questions about the dynamics of learning and compositionality while citing some interesting background reading however im a bit torn about the experiments on the one hand i like the pedagogical nature of the experiments they are small and should be easy to reproduce on the other hand all of them seem to be fairly similar kinds of composition with very few attributes mostly bigrams so whether the intuitions hold for more complex compositional structures is hard to say nevertheless its a well written paper and is a helpful first step towards studying the problem of compositionality in vector representations minor points pg 3 grammar for composing meanings where licensed by derivations seems incorrect figure 5 seems quite noisy to make the linear relationship claim edit i still think the compositions under consideration are the simpler ones still with the new experiments the coverage seems nicer given the authors plan to release their source code i expect there will be an opportunity for the rest of the community to build on these to test tres efficacy on more complex compositions i updated my scores to reflect the change ### Summary:
this paper presents a method for measuring the degree to which some representation for a composed object effectively represents the pieces from which it is composed all three authors found this to be an important topic for study and found the paper to be a limited but original and important step toward studying this topic however two reviewers expressed serious concerns about clarity and were not fully satisfied with the revisions made so far im recommending acceptance but i ask the authors to further revise the paper especially the introduction to make sure it includes a blunt and straightforward presentation of the problem under study and the way tre addresses it im also somewhat concerned at r2s mention of a potential confound in one experiment the paper has been updated with what appears to be a fix though and r2 has not yet responded so im presuming that this issue has been resolved i also ask the authors to release code shortly upon deanonymization as promised
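To make the tre idea discussed in the reviews above concrete, here is a hypothetical minimal sketch, not the paper's code: it assumes additive composition of primitive vectors and an l2 error, whereas the paper treats the composition operator and distance as design choices (cosine is mentioned by the second reviewer). Under these simplifying assumptions the closest compositional approximation has a closed-form least-squares fit, and tre is just the mean residual distance.

```python
# Hypothetical sketch of a tree-reconstruction-error style measure, assuming
# additive composition of primitive vectors and an L2 distance (simplifications
# relative to the paper). Each input i has a learned representation reps[i] and
# a set of annotated primitive indices; TRE is the mean distance between reps[i]
# and its best compositional approximation.
import numpy as np

def tre_additive(reps, annotations, num_primitives):
    """reps: (n, d) learned representations; annotations: list of primitive-index lists."""
    n, d = reps.shape
    A = np.zeros((n, num_primitives))
    for i, prims in enumerate(annotations):
        A[i, prims] = 1.0                                  # which primitives compose input i
    basis, *_ = np.linalg.lstsq(A, reps, rcond=None)       # fitted primitive vectors (num_primitives, d)
    residual = reps - A @ basis                            # error of the compositional approximation
    return np.linalg.norm(residual, axis=1).mean(), basis

# toy example: primitives {0: "blue", 1: "red", 2: "square"}, 4-d representations
reps = np.array([[1.0, 0.1, 0.9, 0.2],
                 [0.1, 1.0, 1.1, 0.2],
                 [1.1, 0.1, 0.0, 0.1],
                 [0.1, 0.9, 0.1, 0.3]])
annotations = [[0, 2], [1, 2], [0], [1]]
score, basis = tre_additive(reps, annotations, num_primitives=3)
print(round(score, 3))   # lower score means the representations look more compositional
```

Note that the composition function and distance remain free choices in such a measure, which is precisely the limitation the first review raises about committing to a particular composition operator.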
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

This paper studies the problem of computing non-data-specific perturbations, also known as universal perturbations, that attack neural networks and exploit their inherent vulnerability. Compared to previous works in the domain, the authors look specifically at equivariant networks and derive geometric insights and methods to compute universal perturbations for these networks.

The paper starts by analysing the main (principal) directions of a set of perturbations that are able to change the decisions of different forms of equivariant neural networks. With this heuristic study, a few main directions are shown to be shared by most adversarial perturbations. The authors then propose to construct universal perturbations built on the insights given by the principal directions of the perturbations, which is an interesting and effective method. In addition, it is shown that a few adversarial samples are sufficient to identify the principal directions quite accurately. The fooling rates achieved by this method are good, which demonstrates that the proposed strategy is reasonable.

The key idea in this paper, using principal shared directions of perturbations computed on a small subset of data points, has unfortunately already been proposed and tested in classical non-equivariant neural networks; see for example Fig. 9 in Moosavi-Dezfooli et al. (2017), cited in the paper and published in CVPR 2017. The present paper does add a few additional bits of information, with a nice theoretical analysis where the previous works were mostly based on heuristics, but that is probably not sufficient to pass the cut at ICLR.

The interesting additional novelty here is the study of equivariant networks; however, this ends up falling short of initial expectations. There seems to be nothing specific to equivariant networks in the proposed study, and the solution and algorithm are actually applicable to any neural network architecture. Also, no specific insights are derived for equivariant networks, which could have been very interesting for making progress in understanding equivariant representations, still a widely open research problem.

In general, the paper has a non-classical organisation, with a lot of heuristics that are not discussed in depth. This gives the high-level impression that the proposed idea is potentially nice but only superficially addressed. It should probably be improved in the next versions of this work.

docsep

The authors made an interesting observation: there is an important common subspace of gradient/FGSM/DeepFool attack directions shared among all examples. They therefore propose to use the top SVD components of these directions to conduct a universal attack. This is an interesting finding, but also not surprising. We know the gradient of the loss function with respect to the input can be used for interpretability, and in MNIST examples it usually reveals some rough shape of the class; this is also observed in Figures 8-13 of this paper, so it makes sense that the gradient directions share a common subspace. I therefore think this observation by itself is not significant enough.

Using it for a universal attack is interesting; however, the experiments are not that convincing:
1. To show this is a good way to mount a universal attack, the authors should compare with the previous work of Moosavi-Dezfooli et al.
2. All the experiments are on MNIST; how about CIFAR or ImageNet?

docsep

The paper presents some interesting observations related to the connection between universal adversarial attacks on CNNs and spectral properties. While most of the results are empirical, the authors present two theorems to justify some of the observations. However, the paper is poorly written and very hard to read; rather than providing too many plots and results in the main paper, maybe use the supplementary material. The empirical results should be better explained to help the readers, and similarly the implications of the theorems are not really clear and a bit hand-wavy.

xxxxxxxxxxxxxx

It seems that the authors provided a generic response to all the reviewers, and I am not sure they acknowledge the lack of clarity and the many hand-wavy explanations in the paper. This issue has been raised by other reviewers too and is quite critical for becoming a good paper worthy of ICLR; therefore I am unable to update my score for this paper. However, I do appreciate the comparison with Moosavi-Dezfooli et al. (CVPR 2017); this is a good addition, as suggested by another reviewer.

### Summary:

The topic of universal adversarial perturbations is quite intriguing and fairly poorly studied, and the paper provides a mix of new insights, both theoretical and empirical in nature. However, significant presentation issues make it hard to properly understand and evaluate them. In particular, the theoretical part feels rushed and not sufficiently rigorous, and it is unclear why focusing on the case of equivariant networks is crucial. It would also be useful if the authors put more effort into explaining how their contributions fit into the context of prior work in the area. Overall, this paper has the potential of becoming a solid contribution once the above shortcomings are addressed.
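The second review in the record above describes the attack quite concretely: stack per-example gradient directions, take the top SVD component, and use it as a single universal perturbation whose quality is judged by its fooling rate. As an editorial aid only, the following is a minimal, hedged sketch of that procedure in PyTorch; the model, data loader, perturbation budget `eps`, and function names are stand-ins introduced here for illustration, not anything taken from the reviewed paper.

```python
# Illustrative sketch of the "top SVD component of attack directions" idea
# discussed in the reviews above. Model, loader and eps are assumed inputs.
import torch
import torch.nn.functional as F

def attack_directions(model, loader, device="cpu", n_batches=10):
    """Collect per-example gradient directions of the loss w.r.t. the input."""
    dirs = []
    model.eval()
    for i, (x, y) in enumerate(loader):
        if i >= n_batches:
            break
        x = x.to(device).requires_grad_(True)
        y = y.to(device)
        loss = F.cross_entropy(model(x), y)
        grad, = torch.autograd.grad(loss, x)
        dirs.append(grad.flatten(1).sign())   # FGSM-style direction, one row per example
    return torch.cat(dirs, dim=0)             # shape: (N, input_dim)

def universal_perturbation(directions, eps):
    """Top right-singular vector of the stacked directions, scaled to an L_inf ball."""
    _, _, v = torch.svd_lowrank(directions, q=1)  # only the leading component is needed
    return eps * v[:, 0].sign()                   # one perturbation shared by all inputs

def fooling_rate(model, loader, delta, device="cpu"):
    """Fraction of examples whose predicted label changes when delta is added."""
    changed, total = 0, 0
    with torch.no_grad():
        for x, _ in loader:
            x = x.to(device)
            clean = model(x).argmax(dim=1)
            pert = model(x + delta.view(1, *x.shape[1:])).argmax(dim=1)
            changed += (clean != pert).sum().item()
            total += x.size(0)
    return changed / total
```

A fooling rate well above what a random perturbation of the same norm achieves would support the reviewers' reading that the shared gradient subspace is what makes the universal attack work.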
[input_ids, attention_mask and labels arrays for the record above omitted: the input_ids re-encode the review and summary text, the attention_mask is a run of 1s, and the labels column mirrors input_ids.]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

This paper is a survey of 15 datasets related to US criminal justice. It describes the US justice pipeline in a flowchart and sorts the datasets according to it. The authors propose a datasheet for each dataset, which summarizes relevant information on the dataset: data collection, motivation, uses, distribution and maintenance. They highlight that there are parts of the justice pipeline to which no dataset corresponds and discuss domain-specific difficulties in the data collection process.

Strengths:
1. The paper formalizes the US justice pipeline as a flowchart and uses it to link the datasets to relevant parts of the pipeline, which makes it easy for researchers to choose a dataset.
2. A datasheet is associated with each dataset, providing a very practical and useful summary of each dataset.
3. A gap in coverage of the justice pipeline by existing datasets is identified, which could guide the collection of future datasets, although the authors note it is difficult.
4. Political and social implications of the availability of such datasets are quickly discussed.

Weaknesses:
1. Although it is central to this kind of data, the ethical discussion is a bit limited: Section 6 only contains one sentence about it, and the datasheets do not seem to contain any information regarding these questions.
2. The authors provide download links for the datasets, but I would have liked a unified method that downloads the datasets automatically, making the use of multiple criminal justice datasets practical for researchers.
3. The criminal justice pipeline described in Figure 1 does not seem to be supported by any reference. I believe there are two possibilities: either it is a well-known pipeline, in which case I would appreciate a link to relevant previous works, or it is a contribution of the paper, in which case a reference to e.g. the law could be nice (but I admit I am no expert in US criminal justice). It also looks impossible to escape the pipeline after being charged; I would imagine that after probation one can be free, but this does not appear in Figure 1.
4. The authors mention many datasets but do not include datasheets for each of them. What motivates the choice between the datasets chosen for datasheets and the others?
5. Some datasets contain multiple data records about one given individual, and the authors mention that partial criminal paths can be reconstructed from these datasets. It is not clear how many of these partial paths can be reconstructed (all records? 10% of records? 1%?).
6. The authors mention that different data collection processes lead to different kinds of unfairness in the datasets. They claim that bias may be present at each step of the pipeline but fail to propose any, even very basic, statistics to support this claim.
7. I find Figure 1 to be a little too strict: are the authors sure that their associations between one dataset and one part of the pipeline are correct? I am worried that someone could find other uses for these datasets, either by applying new methods to them or by combining multiple datasets gathered in the survey.

docsep

This work surveys datasets in the criminal justice field that are often used for machine learning or fairness research. It supplements this survey with a catalog of datasheets that are newly generated by the authors (except for one, where the dataset creators already generated a datasheet). Along with the survey, there is a synthesis of knowledge where the authors discuss gaps in the datasets and potential downstream analyses.

Utility and quality of the submission (impact, originality, novelty, and relevance to the NeurIPS community will all be considered): This piece reviews an important area, criminal justice datasets, which are used in a large body of fairness literature. The work provides a survey of relevant criminal justice datasets and supplements them with datasheets. This is important and relevant to the NeurIPS community and provides a nice survey while adding additional value in the form of datasheets. There is a nice synthesis of information from the authors' survey in the discussion section, which also adds value.

Completeness of the relevant documentation (for datasets, sufficient detail must be provided on how the data was collected and organized, what kind of information it contains, how it should be used ethically and responsibly, and how it will be made available and maintained; for benchmarks, best practices on reproducibility should be followed): The main contribution of this work is documentation, so it certainly has that.

Accessibility and accountability (for datasets, there should be a convincing hosting, licensing and maintenance plan): There is a GitHub repo for maintaining and updating the datasheets.

Utility and quality of the submission: This isn't the most novel or original work, since it is completing datasheets for datasets that have been examined by a number of works in the past; this piece cites many such related works. However, no other work is as comprehensive in its survey and no work systematically creates datasheets, so concerns about novelty are somewhat minor. The discussion on biases could be strengthened; at the point where it is discussed, it feels like an afterthought, but there are significant problems here beyond just implicit biases. For example, earlier there is discussion of how Hispanics are often reported as white; this would lead to significant problems with biases and errors in downstream analyses and models, especially for things like fairness that account for demographics. This discussion could be expanded to touch on more of the findings from the authors' exploration that might affect models or analyses. Similarly, much of the discussion makes it seem like omitting information is a strictly bad thing; for example, the authors seem to lament that victim information is not included to conduct analyses with, but these datasets require significant care and attention to privacy considerations. More engagement with these privacy considerations would strengthen the paper.

Completeness of the relevant documentation: Some of the datasheets are not particularly detailed; if the contribution of the work is the datasheets, I would have expected a bit more. For example, in the CPII datasheet the question "what data does each instance consist of?" has the response that the data is compiled from 27 different sources, each source has a different set of variables, and all sources report on the date, time and location of the crime as recorded and the type of the offense. I would have expected a table or breakdown here to make this information available up front. Similarly, on this same page there are some broken references. I would suggest a pass over the supplementary datasheets to add more detail to spots like this and to fix broken refs.

Accessibility and accountability: If the goal is to raise awareness of these datasheets and the underlying gaps in these datasets, I would suggest creating a project webpage to host them in a way that is more prominent than as PDFs in a GitHub repo. This will raise the impact, make the work more accessible, and might get others to update the datasheets as new information is identified (such as for the questions regarding whether the data is already being used).

docsep

The authors conduct a survey of criminal justice datasets. Their main contributions are a discussion of these datasets in the context of the full criminal justice pipeline and a public repository containing datasheets for 15 selected datasets. This is a well-written paper on an important topic; papers like this will become increasingly important in the realm of machine learning, and datasheets for datasets and specialized repositories are essential for responsible data use. The survey is thorough and thoughtfully done. I think the paper could benefit from an expanded discussion of related work and of misuse of these datasets.

docsep

In this paper the authors surveyed datasets from the National Archive of Criminal Justice Data (NACJD) and compiled a list of datasets arising from a variety of points in the criminal justice pipeline. For 15 of these datasets the authors created publicly available datasheets, using a template that they updated from the original in order to better suit criminal justice datasets. The authors briefly describe each of the 15 datasets in the paper and create two index tables that summarize, for each dataset, (1) the type of criminal justice information and demographics covered and (2) the size, composition, maintenance and license information. Finally, the authors discuss challenges in working with criminal justice datasets and illustrate these points using examples from the 15 surveyed datasets.

Strengths:
1. It is in the best interest of the research community, among others, to broaden which datasets are used when studying criminal justice rather than focusing evaluations on a few datasets (e.g. COMPAS). With this paper the authors have initiated this process by shining a light on 15 potential new datasets which are already publicly available.
2. The 15 datasets are thoughtfully organized and presented. In particular, I found mapping the datasets onto the pipeline (Figure 1) to be a useful tool for getting quickly acquainted with the datasets, and it nicely complements Tables 1 and 2.
3. In creating the index tables and a new datasheet template tailored to criminal justice datasets, the authors have initiated the important discussion about what metadata should accompany criminal justice datasets and how these metadata questions might be standardized for describing new criminal justice datasets going forward. This is an important discussion for datasets of any field, but is particularly challenging and important for criminal justice datasets, for which context is often not properly considered.
4. The paper is well written and should be easy to understand for a lay machine learning audience.

Major points:
1. Since the authors are bringing 15 criminal justice datasets to the attention of the ML community, it seems important to discuss in the paper why introducing these datasets is beneficial to the ML community and to society. To this point, I feel it is important to answer the following questions: How does providing the datasheets improve how these datasets can be used in ML, beyond the official documentation? Does introducing these datasets help alleviate some of the existing problems with the use of criminal justice datasets in ML, or will this just extend current problems to new datasets?
2. In the introduction the authors state "we give broad context to the datasets, draw out potential uses, and discuss gaps and limitations". While the paper does address the first and third points, it does not, in my opinion, adequately address potential uses for these datasets. Questions III.a-d in the datasheets provide some information on potential uses, but for a general machine learning audience it seems important to broadly discuss in the main paper how these 15 datasets should be used by the ML research community. Should they primarily be used to investigate the criminal justice pipeline, as most of the uses in the datasheets seem to indicate? Should these datasets be used as benchmark datasets for testing out new methods which are not necessarily tailored to criminal justice applications?
3. I found myself quite curious about the updates made to the datasheet template as described in Section 2. This seems like an important contribution of this paper, but it is not highlighted as such. What gaps in the original template did these updates fill? What unique challenges do criminal justice datasets pose to metadata documentation?
4. It remains a bit unclear to me how the authors envision researchers using Section 4, which is nearly three pages dedicated to short descriptions of the 15 datasets. Is the intention that these descriptions provide a quick introduction to a dataset, and if interested one should then go to the datasheet for more information? The combination of Figure 1 and Tables 1 and 2 seems quite useful for this purpose in and of itself; the benefit of including these short descriptions in the main paper vs. the supplementary material is not clear to me.
5. While Section 3 provides necessary context, in particular for Figure 1, I am not well versed in this area and find it troubling that this section does not have any references.

Minor points:
1. In the introduction, one of the paper's highlighted contributions is stated as reporting on 30 datasets; however, it is not clear to me that this is really the case. Throughout Section 4 other datasets are mentioned in addition to the main 15, but these are not included in the index tables nor in Figure 1 and are not provided datasheets. While I do see the usefulness in directing readers to other potentially relevant datasets in Section 4, I found myself a bit confused, in particular moving from Sections 1 to 2, as to where the 30 vs. 15 datasets were coming into play. Now, after reading the entire paper, I don't feel that this is one of its contributions. I have noted this as a minor point because there is actually fairly little mention of the 30 datasets, and removing mentions of this would seem to affect very little of the paper.
2. Related to the previous point, it is still a little unclear to me how the authors arrived at the 15 datasets for which they created datasheets. Since there seem to have been 30 datasets which met the mentioned inclusion criteria, why these 15? Are these 15 datasets useful for an ML audience in particular?
3. Figure 1: I just want to verify that the colors in this figure correspond to the stages as mentioned in Table 1; it may be useful to indicate this somewhere on the figure.
4. Section 7, typo: I believe "trough" should be "through".
5. Table 2: inconsistent capitalization in the geographic resolution and maintained columns.

docsep

The paper presents a collection of data sources on the US criminal justice system, with the goal of providing researchers with an overview of available data sources. Datasets stem from a variety of fields, from crime reports to jail/prison sentencing. The authors furthermore provide datasheets for 15 selected datasets, which gives a standardized, structured resource for accessing relevant data characteristics.

My major complaint would be the lack of a maintenance schedule/route towards updating the datasheets and the collection in general. This not only prevents collaboration with other researchers but also results in the paper being an immutable artefact, which in my opinion is not adequate for such a collection of datasets. If this complaint were addressed, I would recommend acceptance of the paper.

The paper makes an important contribution to the field of criminal justice datasets by collecting and systematizing a variety of available datasets. The survey seems to be exhaustive within its limited domain of US datasets and provides a good overview of relevant data sources. The authors provide datasheets for 15 datasets, which form a comprehensive and structured resource that answers many of the relevant questions with respect to those datasets. The authors discuss the funnel arising from the sequential decisions made in the court system and provide an overview of procedures that typically lead to this funnel; this gives a good understanding of the structure of the different data sources. The paper discusses a series of relevant limitations of the collected data, raising awareness for potential use cases.

I think the paper would make an even stronger case if the results were presented in the form of a website/GitHub repository enabling easier navigation. This would also allow other investigators to contribute, e.g. via pull requests, and allow for updating resources if, for example, errors are found in the datasets or when new datasets become available. I would strongly suggest further work in that direction. In their current version the datasheets are an artefact for which maintenance and updates are unclear, which I would consider a major drawback of the paper. Improving in this direction could be done by (1) including a statement regarding maintenance and updates, (2) uploading LaTeX sources that enable updating the datasheets, (3) uploading a LaTeX template for similar datasheets, (4) outlining how people can contribute new datasets to this collection, how datasheets can be updated, and the criteria for such changes, and (5) providing a good overview by improving the README in the existing GitHub repository.

docsep

The authors present an analysis of criminal-justice-related datasets. I think this is an important topic, and I appreciate the degree to which they are exploring the pros and cons of these datasets. Overall this seems to be a valuable resource, though I have some concerns about the completeness of the model. This is a good summary of many diverse datasets, collecting information that can be used to make informed choices about how to use this data. However, I have some concerns.

My biggest concern is the presumption in their model that a crime was committed. The criminal justice system is imperfect: not everyone is convicted, nor is everyone who is convicted guilty. Do no datasets address this issue? Even if that is the case, given that the authors wish to identify gaps in the datasets, why is that not represented in Figure 1, outside of the word "acquittal"?

In addition, the model is incomplete. For example, the list of outcomes of pretrial hearings is incomplete, as people may be released but required to wear an ankle monitor, just as an example; I am not an expert in this area, so there may be other outcomes missing too that I am unaware of. Further, the order of operations shown is not accurate to actual criminal justice experiences. For example, many people have to pay fines even if they are not convicted (for example, fees for required ankle bracelet surveillance between charging and conviction, or paying back a loan for bail money); my neighbor was forced to pay a fine simply for appearing in court, even though she was not charged in the end. Another thing that seems to be missing from the model provided is the type of defender (public or private, for example) that participants had access to, or even whether, when and how such a person was assigned.

### Summary:

The paper introduces a set of criminal justice datasets to the machine learning community, surveying 30 datasets and creating datasheets for 15 of them. Reviewers appreciated that the paper raises awareness of these datasets in the ML community and the documentation work that the authors have contributed. There were two main concerns: inadequate discussion of ethics and lack of detail on how the ML community could work with these datasets. The authors have addressed the first concern in a revision and partially addressed the second concern.
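Several of the reviews in the record above describe the same concrete structure: a per-dataset datasheet (data collection, motivation, uses, distribution, maintenance) plus index tables listing coverage, size, composition, maintenance and license. Purely as an editorial illustration, a machine-readable sketch of one such index entry might look like the following; the field names and example values are invented for illustration and are not the paper's actual schema or data.

```python
# Hypothetical sketch of one row of the "index table" the reviewers describe.
# All field names and the example values below are assumptions, not taken from the paper.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class DatasetIndexEntry:
    name: str
    pipeline_stage: str                    # e.g. "policing", "pretrial", "sentencing"
    justice_info: List[str]                # types of criminal justice information covered
    demographics: List[str]                # demographic attributes recorded
    size: str                              # rough number of records
    geographic_resolution: str             # e.g. "national", "state", "county"
    maintained: bool
    license: str
    datasheet_url: Optional[str] = None    # link to the accompanying datasheet, if any

# An invented example entry, purely to show how such an index could be queried.
example = DatasetIndexEntry(
    name="Example incident-report dataset",
    pipeline_stage="policing",
    justice_info=["offense type", "date", "location"],
    demographics=["age", "race", "sex"],
    size="~1M records",
    geographic_resolution="county",
    maintained=True,
    license="public domain",
)

def by_stage(entries, stage):
    """Filter index entries by criminal-justice pipeline stage."""
    return [e for e in entries if e.pipeline_stage == stage]

print([e.name for e in by_stage([example], "policing")])
```

A structured index like this is one way the "unified access" wish expressed in the first review could be approached, although, as the reviewers note, automatic downloading would additionally require per-dataset licensing and hosting arrangements.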
[input_ids, attention_mask and labels arrays for the record above omitted: the input_ids re-encode the review and summary text, the attention_mask is a run of 1s, and the labels column mirrors input_ids.]
5380, 4404, 22753, 7621, 37586, 285, 253, 4849, 275, 2087, 50276, 2520, 1057, 417, 760, 3657, 14448, 342, 643, 8607, 533, 671, 1543, 275, 253, 2929, 1146, 271, 4293, 13508, 39624, 12690, 534, 275, 619, 4743, 50276, 261, 417, 10599, 323, 824, 247, 4849, 273, 15302, 604, 436, 5833, 497, 9713, 891, 651, 5583, 14924, 273, 253, 2929, 253, 2929, 2789, 271, 1774, 7680, 281, 253, 1673, 273, 6424, 8426, 15302, 407, 17055, 285, 985, 255, 3006, 247, 5235, 273, 2130, 15302, 253, 6630, 3133, 281, 320, 41389, 1561, 697, 3710, 5028, 273, 441, 15302, 285, 3400, 247, 1175, 18389, 273, 4623, 941, 4973, 50276, 783, 4477, 2085, 7621, 37586, 323, 15302, 327, 1458, 15302, 534, 2085, 247, 11088, 285, 18872, 7741, 326, 9172, 1142, 273, 253, 4623, 3533, 8772, 1110, 15302, 50275, 783, 4477, 2319, 253, 37346, 14475, 432, 253, 22453, 7089, 1160, 275, 253, 1302, 985, 285, 2085, 271, 18389, 273, 7259, 326, 5431, 1421, 281, 436, 37346, 436, 3400, 247, 1175, 4685, 273, 253, 2605, 273, 253, 1027, 941, 4973, 50276, 783, 2929, 25339, 247, 2962, 273, 4623, 7364, 281, 253, 5728, 941, 12976, 11891, 323, 2442, 897, 2219, 50275, 74, 1158, 253, 2929, 651, 1056, 271, 1014, 10046, 1083, 604, 253, 1543, 497, 3559, 275, 253, 830, 273, 247, 4422, 7280, 18491, 17690, 6927, 15034, 436, 651, 671, 1581, 643, 17085, 281, 8162, 24088, 3066, 3785, 9762, 285, 1581, 323, 22753, 5300, 604, 24088, 6332, 403, 1119, 275, 253, 15302, 390, 672, 747, 15302, 2489, 2130, 891, 651, 7052, 1804, 2007, 789, 275, 326, 3884, 275, 616, 1655, 2715, 253, 7621, 37586, 403, 271, 39624, 12690, 323, 534, 9363, 285, 11269, 403, 12744, 534, 891, 651, 1908, 247, 2201, 32489, 273, 253, 2929, 50276, 303, 40037, 275, 436, 3884, 812, 320, 2218, 407, 50276, 18, 1690, 247, 3908, 5001, 9363, 285, 11269, 374, 49487, 44127, 4973, 326, 8046, 22753, 253, 7621, 37586, 495, 49487, 247, 44127, 7646, 323, 2074, 7621, 37586, 577, 562, 30927, 849, 952, 476, 8162, 747, 15302, 281, 436, 4849, 849, 7621, 37586, 476, 320, 9300, 285, 253, 6866, 323, 824, 2544, 577, 5277, 247, 1175, 18389, 50276, 303, 40037, 253, 1239, 1405, 275, 253, 5368, 40477, 18491, 50275, 7152, 339, 431, 248, 4477, 1246, 271, 1783, 273, 6424, 8426, 2905, 941, 5239, 891, 1158, 436, 310, 271, 1774, 9400, 285, 891, 11435, 253, 4248, 281, 534, 597, 403, 18216, 5847, 285, 772, 273, 841, 941, 5239, 50276, 1189, 455, 436, 3133, 281, 320, 247, 9865, 7741, 891, 452, 690, 7350, 670, 253, 29867, 273, 253, 1566, 2299, 50274, 2520, 310, 247, 1175, 6010, 273, 1142, 11117, 941, 5239, 17055, 1491, 326, 476, 320, 908, 281, 1056, 8191, 10165, 670, 849, 281, 897, 436, 941, 50276, 35529, 891, 452, 690, 7350, 50275, 2577, 5962, 4468, 310, 253, 22797, 275, 616, 1566, 326, 247, 6617, 369, 7730, 253, 6424, 8426, 985, 310, 35180, 285, 417, 4130, 310, 13084, 4543, 310, 4130, 665, 310, 13084, 8106, 513, 642, 941, 5239, 2953, 436, 2523, 1014, 604, 326, 310, 253, 1083, 1677, 326, 253, 4477, 5730, 281, 4271, 18388, 275, 253, 941, 5239, 2139, 310, 326, 417, 6607, 275, 4677, 337, 3345, 273, 253, 3159, 49207, 50275, 249, 1635, 253, 1566, 310, 18464, 50276, 1542, 1650, 253, 1618, 273, 6973, 273, 37631, 26388, 310, 18464, 347, 952, 778, 320, 4439, 533, 2424, 281, 8251, 271, 23469, 5724, 816, 347, 271, 1650, 891, 717, 417, 271, 6485, 275, 436, 2170, 594, 627, 778, 320, 643, 6973, 326, 403, 5816, 1512, 326, 891, 717, 25229, 273, 50275, 44295, 253, 1340, 273, 5871, 2011, 310, 417, 7899, 281, 4588, 6424, 8426, 8450, 323, 1650, 1142, 952, 452, 281, 2075, 29701, 1014, 604, 597, 403, 417, 13084, 323, 1650, 8128, 323, 2424, 23469, 27965, 1059, 13234, 875, 
16153, 285, 9611, 390, 10054, 896, 247, 10119, 323, 19590, 2583, 619, 6346, 369, 6726, 281, 2075, 247, 4030, 3365, 323, 15602, 275, 1302, 1014, 2167, 703, 369, 417, 6636, 275, 253, 990, 50275, 23955, 2181, 326, 3133, 281, 320, 5816, 432, 253, 1566, 2530, 310, 253, 1511, 273, 26762, 1345, 390, 3055, 323, 1650, 326, 5014, 574, 2289, 281, 390, 1014, 1880, 672, 285, 849, 824, 247, 1436, 369, 7922, 50276, 187, 187, 4118, 18435, 27, 783, 2929, 23970, 247, 873, 273, 6424, 8426, 15302, 281, 253, 5145, 4715, 3114, 8957, 3184, 1884, 15302, 285, 6153, 7621, 37586, 323, 1458, 273, 731, 30628, 14109, 326, 253, 2929, 16540, 11891, 273, 841, 15302, 275, 253, 13361, 3114, 285, 253, 10097, 789, 326, 253, 4477, 452, 9945, 627, 497, 767, 2022, 7350, 18766, 5955, 273, 18035, 285, 3480, 273, 2508, 327, 849, 253, 13361, 3114, 812, 789, 342, 841, 15302, 253, 4477, 452, 9713, 253, 806, 4468, 275, 247, 18520, 285, 10571, 9713, 253, 1273, 4468 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: in this paper the authors present a discussion on two mainstream metalearners with attentional classifiers and threshold metalearners under a unique view of polythetic classification and to address the limitations of both attentionbased feature selection is introduced improved performance is demonstrated by both synthetic and realworld fewshot learning tasks strengths this paper discusses metalearners from a very unique perspective with the shortcomings of both protonetlike threshold metalearners and matchingnetlike attentional metalearners discussed solidly with intuitive examples and convincing derivation of misclassification rates the proposed attentional feature selection is simple yet effective and can potentially guide and stimulate many followup improvements to metalearning weaknesses my main concern is that this paper tends to isolate itself from the entire research of metalearning and focus on a corner instead protonets representing threshold metalearners and matchingnets representing attentive metalearners are insufficient to cover the entire research of meta learning there are many other directions of metalearning that are completely overlooked in the discussion the most representative case is the innerloop optimizationbased metalearner like maml [1] where the innerloop adaptation to the feature extractor can potentially solve the challenge that "not all features are relevant in all tasks" and that "the support is unlikely to span the input domain" pointed out in this paper and many other directions and methods eg methods based on hebbian rules [2] might share a similar spirit with the proposed method and the presentation can be further improved by incorporating a more comprehensive discussion of the latest progress of metalearning and the connections of the proposed method to others minor the experimental settings included in this paper seem a bit weak more common benchmarks on realworld metalearning classification tasks like miniimagenet and the challenging crossdomain settings can better support the discussions the overall writing is good but some further improvements are expected for example is the very first sentence of this paper grammatically incorrect i believe its better to say "that need to be neither universal nor" [1] modelagnostic metalearning for fast adaptation of deep networks icml 2017 [2] differentiable plasticity training plastic neural networks with backpropagation icml 2018 i give an initial recommendation of score 6 mainly for appreciating this novel view of mainstream metalearners and solid discussions supporting the points however to meet the standard of iclr and demonstrate a clear contribution to the research of metalearning i believe further efforts on comprehensive discussions are highly expected docsepauthors propose the general problem of fewshot polythetic classification where class membership is determined by the combinations of present and absent features and the salient features and combination patterns change at test time authors demonstrate that prototypical approaches with linear decision boundaries respond poorly to these highly nonlinear problems while attentionbased soft nearest neighbors paradigms work well but overfit to specious cues an attentionbased feature refinement technique is proposed and evaluated beating both baselines on specifically polythetic fewshot tasks strengths the fewshot polythetic classification problem is interesting and
novel and the shortcomings of baseline approaches are readily apparent work is wellgrounded in older prior literature and the xoralpha task is extensively analyzed both theoretically and empirically writing is clear and concise though not always easy to understand see below the proposed method performs well in its intended setting weaknesses / issues in no particular order the xoralpha metalearning task is never formally described it is only described as a function of bits on page 3 and so subsequent sections become difficult to follow please state clearly in section 2 that xoralpha also refers to the collection of presumably all alphavariable xor tasks in a given binary n-space with n > alpha with a presumably random partition of train and test tasks perhaps also consider using different notation when referring to the metalearning problem ie caption for fig 3 right vs the task ie second to last paragraph of pg 4 either way the problem setup for this and other experimental tasks more broadly ie the nonstandard tieredimagenet task should be elaborated up front and more clearly as is i would have a difficult time reimplementing some of these benchmarks without the provided code contrary to claims in the abstract and conclusion the paper does not show that the embedding space of a threshold classifier must grow exponentially with the number of features pg 4 discusses embedding growth but has it at O(n^alpha) which for given alpha is only polynomial in the number of features the embedding space does grow exponentially with task complexity alpha which is itself O(n) but they are not the same and these claims should be clarified assuming that this is in fact the authors intended argument on a much broader level i have a hard time understanding the motivation for this work what is the envisioned use case scenario for this kind of approach the synthetic examples while interesting are very contrived which is fine since theyre synthetic but the paper does not provide a strong motivation for the real world examples omniglot and tieredimagenet either these are also somewhat contrived while the finetocoarse generalization task under discussion is interesting its the opposite coarsetofine generalization that has a clear use case what is the envisioned usecase for a polythetic metalearner the introduction suggests certain kinds of finegrained classification but if this is the case then it should be investigated directly such as with cub or tiered metainat benchmarks the first paragraph of page 2 is difficult to parse both because sentences are somewhat long and mostly because it is not immediately clear how conclusions follow from premises ie the described 45-degree rotation / change of basis of the or function was initially confusing to me because this geometric operation does not translate intuitively into pure boolean algebra the rates of growth 2^(2^n) and 2^(n^2) are not immediately obvious this section could use some elaboration and clarification i do not find appendix a convincing appendix a claims to be a demonstration that protonets do not generalize to unseen variable combinations while it is true that train and test combinations do not overlap by my understanding the chance performance here is clearly due to the fact that training time noise features are test time signal features and vice versa and the network has simply and correctly learned to suppress the relevant noise features as such no learningbased classifier should be able to solve this problem as presented it is less a demonstration of the limits of
protonets and more a demonstration of the no free lunch theorem a much more useful and interesting set of results would be to partition the combinations of active variables randomly over train and test so that combinations do not overlap but features are equally active in expectation at both train and test time by my understanding this is exactly the experiment in fig3 right though so perhaps this ought to just be removed entirely authors use a pretrained resnet18 for the tieredimagenet experiment but the provided source for the model is pytorch itself how was this model pretrained if the model was pretrained on imagenet then it has been trained on the test images and these results are not valid typo in algorithm 1 input the an arbitrarily ordered matrix an arbitrarily ordered matrix the paper proposes an interesting problem and based on theoretical and empirical analysis provides a neat solution issues stem from the perhaps overly concise writing many aspects of the paper are in need of elaboration these include the motivation problem setups for the various benchmarks and the reasoning behind certain conclusions currently i am scoring the paper as if my understanding is correct and authors intended arguments are simply wrong but if these issues can be clarified id be happy to raise my score postdiscussion authors mostly address the above issues in discussion and in the revised manuscript i have some lingering concerns shared with reviewer q98z regarding the lack of demonstrated practical benefit and the fact that accuracy gains in the more realworld benchmarks shrink substantially relative to the motivating xor problem however the revised paper is much improved conceptual novelty remains high and on the whole the paper is of sufficient quality for acceptance docsepthis paper discusses monothetic and polythetic classifications in the context of fewshot learning distinguishing similar classes often require reasoning with combinations of certain features which can be especially challenging when only a few training examples are available two common types of fewshot methods protonets and matching networks are shown to be threshold classifiers and attentional classifiers respectively each with their advantages and disadvantages the authors introduce a simple method based on selfattention as a solution to these challenges with experiments on several toy tasks and the more realworld tieredimagenet pros 1 strong motivation and introduction of monothetic and polythetic classifications drawing from prior work from other fields 2 interesting analysis and excellent visualizations 3 uses simple boolean tasks to explain background conceptually 4 welldesigned toy datasets and experiments to verify hypotheses of the nature of protonets and matching networks in the context of the paper 5 wellwritten cons 1 practical value isnt clearly demonstrated improvements over matching networks for the only nontoy dataset tieredimagenet is marginal at best 2 limited baselines in experiments more detailed comments this paper examines fewshot learning in the context of monothetic and polythetic classifications this perspective sheds some light on how protonets and matching networks perform classification of features and there is some interesting analysis to back these hypotheses up concepts from the paper are wellexplained with several simple examples and toy settings are designed to illustrate them empirically furthermore the analysis leads to a simple selfattentionbased solution to improve features for classification 
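As a side note for readers new to the two families contrasted throughout these reviews, the sketch below (plain NumPy, invented names and sizes, not the authors' implementation) shows a prototype-style score, a matching-net-style attention score, and a parameter-free self-attention pass standing in for the kind of per-task feature re-weighting the paper proposes:

```python
import numpy as np

def prototype_logits(support, labels, query, n_classes):
    # threshold-style ("protonet-like") rule: one mean prototype per class,
    # the query is scored by negative squared distance to each prototype
    protos = np.stack([support[labels == c].mean(axis=0) for c in range(n_classes)])
    return -((query[None, :] - protos) ** 2).sum(axis=1)

def matching_logits(support, labels, query, n_classes):
    # attentional ("matching-net-like") rule: softmax attention over support
    # points, pooled per class -- a soft nearest neighbour
    sims = support @ query
    attn = np.exp(sims - sims.max())
    attn = attn / attn.sum()
    return np.array([attn[labels == c].sum() for c in range(n_classes)])

def refine_features(support, query, temperature=1.0):
    # one parameter-free self-attention pass over support + query embeddings;
    # it only marks where a learned feature-selection step would sit
    x = np.vstack([support, query[None, :]])
    sims = np.exp((x @ x.T) / temperature)
    attn = sims / sims.sum(axis=1, keepdims=True)
    out = attn @ x
    return out[:-1], out[-1]

# toy 2-way, 2-shot episode with 4-dimensional embeddings
rng = np.random.default_rng(0)
support, labels, query = rng.normal(size=(4, 4)), np.array([0, 0, 1, 1]), rng.normal(size=4)
s, q = refine_features(support, query)
print(prototype_logits(s, labels, q, 2), matching_logits(s, labels, q, 2))
```

The last function is only a placeholder: the paper's refinement is learned, whereas this pass has no parameters at all.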
overall this paper reads well and the analysis leads to some interesting insights into fewshot learning empirically the results are a little more disappointing while several toy experiments eg xor binary strings polythetic mnist are welldesigned to illustrate the advantages of considering polythetic patterns its not as clear how much the realworld exhibits these characteristics the proposed method strongly outperforms the baselines in the toy settings but improvements over the matching networks baseline on tieredimagenet is marginal at best as such as nice as these insights are its not clear if its practically useful additionally while i understand the focus was comparing with protonets and matching networks as representatives of thresholdbased and attentional classifiers it would have been nice to have included more baselines in the experiments questions 1 while conceptually easytounderstand i was previously unaware of categorizing classifiers as either attentional or thresholdbased are there any other kinds or must a nonattentional classifier be threshold based and vice versa 2 how do other fewshot methods fit into this taxonomy for example what are kinds of features monothetic vs polythetic are optimizationbased methods eg maml learning what about simply training a classifier or svm postdiscussion i thank the authors for all their effort during the discussion phase to clarify various questions about their paper and for running additional baselines after reading the other reviews and seeing the authors responses my recommendation remains mostly the same the new perspective on metalearning provided by the authors is an interesting one but the practical benefits of this approach can still be more concretely demonstrated additional experiments in such use cases eg the dna example mentioned in one of the discussions if such a dataset exists would significantly strengthen this paper as stated in the main reviews the empirical results arent particularly impressive but overall the paper does provide some interesting analysis that may encourage new ways of thinking about fewshot learning problems as such i think this paper may be of interest to the iclr community docsepthis paper first considers the limitations of threshold and attentional classifiers they proposed an attentionbased method for feature selection to address the problems of threshold classifiers and attentional classifiers the experiments on several synthetic and realworld fewshot learning tasks seem good strengths 1 the explanation about the challenges of threshold prototypical networks and attention classifiers matching network is interesting 2 the motivation on the polythetic classification is wellmotivated 3 wellwritten and easy to follow weaknesses 1 lack of technological innovation the proposed featureselection mechanism is too simple and the only technological innovation they only use selfattention to obtain a better representation rather than directly using average pooling prototypical networks or all the support data matching network 2 how to choose repetitions r it is a hyperparameter is the bigger the r the better does it have no upper bound 3 lack the experiments for comparison with threshold classifiers and attentional classifiers in the main paper it is crucial to show the problems of threshold classifiers and attentional classifiers 4 selfattention also has some parameters for the transformation why is the proposed method nonparametric 5 the related work is too little maybe consider adding some selfattention work the 
implementation details and related work of the proposed paper should be further added ### Summary:
this paper analyzes problems of existing threshold metalearners and attentional metalearners for fewshot learning in polythetic classifications the threshold metalearners such as prototypical networks require an exponentially growing embedding dimensionality and the attentional metalearners are susceptible to misclassification the authors proposed a simple yet effective method to address these problems and demonstrated its effectiveness in their experiments this paper discusses metalearning from a very unique perspective as commented by a reviewer and clearly explains the problems of widely used metalearning methods however this paper focuses on prototypical networks and matching networks even though many other metalearning methods have been proposed some existing methods seem not to have the problems of prototypical networks and/or matching networks in addition the practical benefits of the proposed approach are not well demonstrated although the additional experiments in the author response addressed some concerns of the reviewers they are not enough to demonstrate the effectiveness of the proposed method
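To make the polythetic setting referred to in this summary concrete, the toy construction below is a parity-style episode in the spirit of the XOR tasks discussed above (an illustration only, not the paper's exact benchmark); it shows why a mean-prototype rule hovers near chance on such tasks:

```python
import numpy as np

def sample_xor_episode(n_features=8, alpha=2, n_support=16, n_query=64, seed=0):
    # toy polythetic episode: the label is the parity (xor) of `alpha` hidden
    # feature positions, every other feature is task-irrelevant noise
    rng = np.random.default_rng(seed)
    relevant = rng.choice(n_features, size=alpha, replace=False)
    def draw(n):
        x = rng.integers(0, 2, size=(n, n_features)).astype(float)
        y = (x[:, relevant].sum(axis=1) % 2).astype(int)
        return x, y
    return draw(n_support), draw(n_query), relevant

(support, sup_y), (query, qry_y), relevant = sample_xor_episode()

# mean-prototype (threshold-style) rule: both parity classes have the same
# per-feature means in expectation, so the prototypes nearly coincide and
# accuracy stays close to chance
protos = np.stack([support[sup_y == c].mean(axis=0) for c in (0, 1)])
pred = ((query[:, None, :] - protos[None, :, :]) ** 2).sum(-1).argmin(1)
print("prototype accuracy:", (pred == qry_y).mean())
```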
[ 50275, 20881, 1255, 265, 22402, 275, 642, 1798, 1340, 50275, 783, 1269, 263, 1637, 5148, 613, 920, 4836, 310, 1620, 19186, 2529, 352, 310, 760, 2529, 347, 247, 1159, 273, 9886, 327, 3239, 495, 285, 594, 6774, 7118, 2489, 2834, 281, 956, 4496, 1375, 4518, 275, 2593, 374, 326, 1269, 263, 1637, 671, 10770, 281, 253, 4849, 273, 18289, 512, 355, 545, 580, 10288, 1269, 263, 8892, 275, 247, 1677, 8985, 295, 5641, 342, 295, 1637, 342, 247, 18289, 3632, 10883, 273, 6194, 285, 1071, 8892, 4931, 671, 1908, 970, 1027, 14951, 672, 14339, 281, 253, 5148, 613, 920, 1895, 26332, 11743, 323, 3036, 20, 987, 4632, 253, 4836, 26332, 1273, 281, 1390, 12494, 273, 23256, 21, 2057, 1039, 253, 1895, 9978, 323, 436, 285, 643, 5661, 8892, 625, 21450, 26332, 253, 1327, 15291, 13898, 433, 303, 6533, 292, 4836, 943, 320, 50221, 598, 2914, 285, 625, 4518, 347, 310, 891, 651, 452, 247, 2834, 673, 294, 303, 3018, 272, 690, 273, 841, 49602, 1293, 253, 2530, 2127, 50275, 19657, 552, 281, 3916, 275, 253, 12002, 285, 6452, 253, 2929, 1057, 417, 921, 326, 253, 21496, 2317, 273, 247, 7887, 30410, 1364, 1756, 28596, 342, 253, 1180, 273, 3386, 23256, 21, 25339, 21496, 3116, 533, 556, 352, 387, 327, 1637, 534, 323, 1677, 9765, 310, 760, 14189, 275, 253, 1180, 273, 3386, 253, 21496, 2317, 1057, 1756, 28596, 342, 4836, 10454, 9765, 534, 310, 3139, 327, 533, 597, 403, 417, 253, 1072, 285, 841, 3916, 943, 320, 31637, 7384, 326, 436, 310, 275, 958, 4477, 6034, 4154, 50275, 251, 247, 1199, 16055, 1268, 891, 452, 247, 1892, 673, 4685, 253, 16038, 323, 436, 789, 752, 310, 253, 44921, 897, 1083, 10076, 323, 436, 2238, 273, 2746, 253, 13506, 6667, 1223, 4722, 403, 1077, 523, 30487, 534, 310, 4030, 1580, 597, 250, 13506, 533, 253, 2929, 1057, 417, 2085, 247, 2266, 16038, 323, 253, 1524, 1533, 6667, 33039, 304, 11753, 285, 13898, 433, 303, 6533, 292, 2057, 841, 403, 671, 8489, 523, 30487, 1223, 253, 1442, 292, 16856, 10788, 26647, 4836, 762, 5955, 310, 4722, 697, 253, 7285, 820, 1032, 292, 1171, 460, 26647, 326, 556, 247, 2590, 897, 1083, 752, 310, 253, 44921, 441, 886, 511, 323, 247, 3488, 783, 3028, 5148, 613, 1216, 253, 10199, 5936, 2176, 9351, 273, 4030, 72, 11273, 9162, 533, 604, 436, 310, 253, 1083, 840, 352, 943, 320, 6949, 3587, 824, 347, 342, 12966, 390, 13898, 433, 1313, 404, 255, 49602, 50275, 783, 806, 12494, 273, 3239, 374, 310, 2834, 281, 14390, 1097, 984, 14683, 403, 8489, 1048, 285, 6571, 984, 352, 310, 417, 4745, 2590, 849, 11815, 403, 1563, 432, 18702, 26332, 253, 2529, 5329, 14577, 9381, 4168, 273, 3720, 273, 253, 390, 1159, 369, 8523, 21643, 281, 479, 984, 436, 17856, 4254, 1057, 417, 16497, 540, 41597, 715, 6313, 12419, 8697, 253, 4142, 273, 3116, 3307, 79, 285, 374, 79, 19, 403, 417, 4745, 4755, 436, 2593, 812, 897, 690, 14883, 318, 285, 37699, 50275, 74, 513, 417, 1089, 30762, 247, 21414, 30762, 247, 3916, 281, 320, 247, 20028, 326, 19025, 1507, 513, 417, 39970, 281, 39709, 4778, 13553, 1223, 352, 310, 2032, 326, 6194, 285, 1071, 13553, 513, 417, 14787, 407, 619, 4685, 253, 4839, 3045, 1060, 310, 4518, 1955, 281, 253, 958, 326, 3733, 673, 6046, 3386, 403, 1071, 673, 2625, 3386, 285, 12008, 26620, 285, 253, 2990, 556, 3365, 285, 9113, 6311, 281, 10476, 253, 4623, 6046, 3386, 347, 824, 642, 4715, 3169, 30410, 943, 320, 2104, 281, 8415, 436, 1895, 347, 3559, 352, 310, 1679, 247, 20028, 273, 253, 7787, 273, 19025, 1507, 285, 625, 247, 20028, 273, 253, 642, 1959, 11157, 10012, 247, 1199, 625, 4217, 285, 4722, 873, 273, 1543, 651, 320, 281, 10883, 253, 13553, 273, 3939, 4903, 12421, 689, 6194, 285, 1071, 594, 326, 13553, 
513, 417, 14787, 533, 3386, 403, 9696, 3939, 275, 15355, 387, 1097, 6194, 285, 1071, 673, 407, 619, 4685, 436, 310, 4555, 253, 3368, 275, 3036, 20, 987, 2167, 594, 4931, 436, 12758, 281, 816, 320, 5176, 7094, 50275, 43355, 897, 247, 3215, 11273, 501, 3024, 1093, 323, 253, 13898, 433, 303, 6533, 292, 3368, 533, 253, 2530, 2603, 323, 253, 1566, 310, 268, 1767, 263, 348, 3139, 849, 369, 436, 1566, 3215, 11273, 604, 253, 1566, 369, 3215, 11273, 327, 4440, 257, 292, 840, 352, 556, 644, 10166, 327, 253, 1071, 3888, 285, 841, 1543, 403, 417, 3588, 50275, 555, 5367, 275, 5933, 337, 3280, 253, 271, 29607, 6960, 4315, 50276, 266, 29607, 6960, 4315, 50276, 783, 2929, 29328, 271, 4722, 1895, 285, 1754, 327, 10527, 285, 16774, 1783, 3400, 247, 18176, 2900, 3374, 8424, 432, 253, 4931, 27662, 44003, 4028, 1142, 7794, 273, 253, 2929, 403, 275, 878, 273, 14883, 318, 841, 2486, 253, 16038, 1895, 873, 8777, 323, 253, 2710, 49602, 285, 253, 14720, 3212, 2176, 11815, 4390, 891, 717, 14755, 253, 2929, 347, 604, 619, 4685, 310, 3451, 285, 4477, 6034, 7125, 403, 3365, 3430, 533, 604, 841, 3374, 476, 320, 31637, 2654, 320, 5211, 281, 7164, 619, 4868, 50275, 5996, 49794, 50276, 43355, 6571, 2953, 253, 1840, 3374, 275, 5955, 285, 275, 253, 17265, 7714, 891, 452, 690, 42578, 7350, 6096, 342, 37317, 2805, 4185, 91, 5001, 253, 3480, 273, 5183, 8542, 5649, 285, 253, 958, 326, 7200, 15988, 275, 253, 625, 1524, 10186, 49602, 23973, 9619, 4103, 281, 253, 15265, 839, 1269, 263, 1895, 2299, 253, 17265, 2929, 310, 1199, 5520, 20178, 38135, 4558, 1029, 285, 327, 253, 2644, 253, 2929, 310, 273, 4209, 3290, 323, 14924, 50276, 7152, 33032, 2520, 2929, 25339, 1114, 4977, 3028, 285, 3488, 783, 3028, 43394, 275, 253, 3634, 273, 1643, 11860, 4715, 32495, 2074, 5971, 2223, 2430, 14720, 342, 13553, 273, 2176, 3386, 534, 476, 320, 3340, 11132, 672, 760, 247, 1643, 3733, 6667, 403, 2130, 767, 1846, 3510, 273, 1643, 11860, 3082, 19025, 1507, 285, 11038, 6928, 403, 2011, 281, 320, 7887, 49996, 285, 4116, 267, 49996, 2975, 1016, 342, 616, 11361, 285, 23797, 253, 4477, 9569, 247, 2969, 1332, 1754, 327, 1881, 42959, 347, 247, 2900, 281, 841, 7881, 342, 4679, 327, 2067, 20953, 8892, 285, 253, 625, 1524, 10186, 13898, 433, 303, 6533, 292, 5847, 337, 186, 9072, 16038, 285, 10199, 273, 1114, 4977, 3028, 285, 3488, 783, 3028, 43394, 10263, 432, 2720, 789, 432, 643, 4910, 374, 186, 47606, 1783, 285, 7126, 5304, 5904, 495, 186, 5123, 2969, 12419, 8892, 281, 5513, 4114, 4473, 1230, 577, 186, 88, 293, 392, 265, 1300, 20953, 15302, 285, 4679, 281, 12654, 24316, 273, 253, 3753, 273, 19025, 1507, 285, 11038, 6928, 275, 253, 3634, 273, 253, 2929, 608, 186, 4714, 15720, 50276, 5040, 337, 186, 81, 26080, 1318, 310, 2649, 4518, 5183, 11701, 689, 11038, 6928, 323, 253, 760, 25450, 899, 10895, 13898, 433, 303, 6533, 292, 310, 16888, 387, 1682, 374, 186, 15870, 1666, 25379, 275, 4679, 50276, 3062, 7000, 5701, 50276, 2520, 2929, 33888, 1643, 11860, 4715, 275, 253, 3634, 273, 1114, 4977, 3028, 285, 3488, 783, 3028, 43394, 436, 8668, 703, 1397, 690, 1708, 327, 849, 19025, 1507, 285, 11038, 6928, 1347, 9162, 273, 3386, 285, 627, 310, 690, 4722, 1783, 281, 896, 841, 24316, 598, 12342, 432, 253, 2929, 403, 6210, 1591, 446, 1243, 342, 2067, 2969, 6667, 285, 20953, 7533, 403, 4158, 281, 17093, 731, 45190, 33810, 253, 1783, 5644, 281, 247, 2969, 1881, 42959, 3169, 2900, 281, 3157, 3386, 323, 9162, 4583, 436, 2929, 9563, 973, 285, 253, 1783, 5644, 281, 690, 4722, 16039, 715, 1643, 11860, 4715, 50275, 358, 5378, 1037, 253, 1543, 403, 247, 1652, 625, 31623, 1223, 2067, 
20953, 4679, 24088, 1269, 263, 8985, 11559, 3488, 783, 3028, 278, 79, 382, 403, 6210, 392, 265, 1300, 281, 17093, 253, 11361, 273, 7296, 3488, 783, 3028, 6127, 697, 417, 347, 2590, 849, 1199, 253, 1524, 10186, 15646, 841, 5319, 253, 4081, 1332, 7052, 41731, 13015, 253, 1666, 25379, 275, 253, 20953, 7533, 533, 11701, 689, 253, 11038, 6928, 8245, 327, 13898, 433, 303, 6533, 292, 310, 16888, 387, 1682, 347, 824, 347, 5322, 347, 841, 16039, 403, 697, 417, 2590, 604, 697, 18236, 4217, 23000, 1223, 891, 2096, 253, 2770, 369, 10941, 342, 19025, 1507, 285, 11038, 6928, 347, 15572, 273, 7887, 3169, 285, 4116, 267, 49996, 352, 651, 452, 644, 5322, 281, 452, 2908, 625, 1666, 25379, 275, 253, 4679, 50275, 34974, 337, 186, 6050, 4473, 1230, 1842, 1767, 10117, 1676, 891, 369, 3786, 25229, 273, 13213, 3006, 49996, 347, 2057, 4116, 267, 390, 7887, 3169, 403, 627, 667, 643, 9351, 390, 1364, 247, 1327, 1595, 41454, 30410, 320, 7887, 1754, 285, 12008, 26620, 374, 186, 5430, 513, 643, 1643, 11860, 3082, 4944, 715, 436, 2891, 13646, 323, 1650, 752, 403, 9351, 273, 3386, 1114, 4977, 3028, 4632, 3488, 783, 3028, 403, 13757, 3169, 3082, 24088, 278, 16878, 4715, 752, 670, 3365, 3733, 247, 30410, 390, 256, 11618, 50274, 5996, 49794, 50276, 74, 5717, 253, 4477, 323, 512, 616, 3434, 1309, 253, 5955, 3408, 281, 19148, 2710, 3533, 670, 616, 2929, 285, 323, 3515, 3081, 1666, 25379, 846, 4361, 253, 643, 10123, 285, 6523, 253, 4477, 6128, 619, 17401, 4558, 6571, 253, 1072, 253, 747, 8668, 327, 5148, 613, 920, 2530, 407, 253, 4477, 310, 271, 4722, 581, 533, 253, 8542, 5373, 273, 436, 2746, 476, 1335, 320, 625, 345, 2414, 600, 5183, 3081, 4679, 275, 824, 897, 2219, 24088, 253, 277, 2072, 1650, 5393, 275, 581, 273, 253, 11985, 604, 824, 247, 10895, 4961, 651, 3012, 17084, 436, 2929, 347, 4767, 275, 253, 2022, 10123, 253, 16774, 1543, 403, 2649, 3782, 13943, 533, 4583, 253, 2929, 1057, 2085, 690, 4722, 1783, 326, 778, 11907, 747, 4088, 273, 4680, 670, 1643, 11860, 4715, 3237, 347, 824, 891, 1158, 436, 2929, 778, 320, 273, 1600, 281, 253, 17857, 32888, 3114, 5474, 33032, 2520, 2929, 806, 19401, 253, 7364, 273, 7887, 285, 4116, 267, 49996, 597, 4081, 271, 4116, 3169, 1332, 323, 4735, 5438, 281, 2953, 253, 3237, 273, 7887, 49996, 285, 4116, 267, 49996, 50276, 783, 4679, 327, 50276, 43249, 13506, 285, 1524, 10186, 1643, 11860, 4715, 8892, 1646, 1175, 50276, 296, 3755, 20556, 50275, 18, 253, 8813, 670, 253, 7881, 273, 7887, 3861, 49225, 6928, 285, 4116, 49996, 11038, 2990, 310, 4722, 50276, 19, 253, 16038, 327, 253, 3488, 783, 3028, 9162, 310, 973, 24013, 8550, 50275, 20, 973, 15720, 285, 3477, 281, 956, 50275, 20881, 1255, 265, 50276, 18, 3480, 273, 20417, 15832, 253, 4081, 4735, 27423, 5122, 310, 1512, 2969, 285, 253, 760, 20417, 15832, 50275, 9328, 760, 897, 1881, 42959, 281, 4044, 247, 1805, 6779, 2581, 685, 3587, 970, 3388, 45900, 3861, 49225, 6928, 390, 512, 253, 1329, 941, 11038, 2990, 50276, 19, 849, 281, 5206, 49495, 391, 50276, 262, 310, 247, 4373, 19484, 310, 253, 8750, 253, 391, 253, 1805, 1057, 352, 452, 642, 5170, 3033, 50276, 20, 3480, 253, 4679, 323, 5301, 342, 7887, 49996, 285, 4116, 267, 49996, 275, 253, 2022, 2929, 352, 310, 9560, 281, 921, 253, 3237, 273, 7887, 49996, 285, 4116, 267, 49996, 50276, 21, 50276, 1286, 42959, 671, 556, 690, 3602, 323, 253, 9261, 2139, 310, 253, 4081, 1332, 1327, 36928, 50276, 22, 253, 2905, 789, 310, 1512, 1652, 5046, 1908, 6240, 690, 1881, 42959, 789, 50274, 783, 7092, 4278, 285, 2905, 789, 273, 253, 4081, 2929, 943, 320, 2007, 2879, 2490, 187, 4118, 18435, 27, 2520, 2929, 3537, 
13505, 3237, 273, 5368, 7887, 5148, 613, 6118, 285, 4116, 267, 5148, 613, 6118, 323, 1643, 11860, 4715, 275, 3488, 783, 3028, 43394, 253, 7887, 5148, 613, 6118, 824, 347, 3861, 49225, 6928, 2430, 17619, 1180, 273, 21496, 7877, 1319, 285, 253, 4116, 267, 5148, 613, 6118, 403, 16931, 281, 3731, 42070, 253, 4477, 4081, 247, 2969, 2568, 3576, 1332, 281, 2953, 841, 3237, 285, 5183, 697, 12510, 275, 616, 4679, 436, 2929, 25339, 5148, 613, 920, 432, 247, 1077, 4451, 8668, 347, 20503, 407, 247, 37317, 285, 4518, 5544, 3237, 273, 7561, 3197, 5148, 613, 920, 3082, 2299, 436, 2929, 2770, 327, 3861, 49225, 6928, 285, 11038, 6928, 1014, 2167, 627, 452, 644, 4081, 1142, 5148, 613, 920, 3082, 690, 5368, 3082, 1646, 417, 281, 452, 253, 3237, 273, 3861, 49225, 6928, 285, 263, 11038, 6928, 275, 1635, 253, 8542, 5373, 273, 253, 4081, 2746, 403, 417, 973, 5183, 3738, 253, 3081, 4679, 275, 253, 2488, 2380, 9713, 690, 7350, 273, 253, 30628, 597, 403, 417, 2217, 281, 7568, 253, 12510, 273, 253, 4081, 1332 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 50275, 20881, 1255, 265, 22402, 275, 642, 1798, 1340, 50275, 783, 1269, 263, 1637, 5148, 613, 920, 4836, 310, 1620, 19186, 2529, 352, 310, 760, 2529, 347, 247, 1159, 273, 9886, 327, 3239, 495, 285, 594, 6774, 7118, 2489, 2834, 281, 956, 4496, 1375, 4518, 275, 2593, 374, 326, 1269, 263, 1637, 671, 10770, 281, 253, 4849, 273, 18289, 512, 355, 545, 580, 10288, 1269, 263, 8892, 275, 247, 1677, 8985, 295, 5641, 342, 295, 1637, 342, 247, 18289, 3632, 10883, 273, 6194, 285, 1071, 8892, 4931, 671, 1908, 970, 1027, 14951, 672, 14339, 281, 253, 5148, 613, 920, 1895, 26332, 11743, 323, 3036, 20, 987, 4632, 253, 4836, 26332, 1273, 281, 1390, 12494, 273, 23256, 21, 2057, 1039, 253, 1895, 9978, 323, 436, 285, 643, 5661, 8892, 625, 21450, 26332, 253, 1327, 15291, 13898, 433, 303, 6533, 292, 4836, 943, 320, 50221, 598, 2914, 285, 625, 4518, 347, 310, 891, 651, 452, 247, 2834, 673, 294, 303, 3018, 272, 690, 273, 841, 49602, 1293, 253, 2530, 2127, 50275, 19657, 552, 281, 3916, 275, 253, 12002, 285, 6452, 253, 2929, 1057, 417, 921, 326, 253, 21496, 2317, 273, 247, 7887, 30410, 1364, 1756, 28596, 342, 253, 1180, 273, 3386, 23256, 21, 25339, 21496, 3116, 533, 556, 352, 387, 327, 1637, 534, 323, 1677, 9765, 310, 760, 14189, 275, 253, 1180, 273, 3386, 253, 21496, 2317, 1057, 1756, 28596, 342, 4836, 10454, 9765, 534, 310, 3139, 327, 533, 597, 403, 417, 253, 1072, 285, 841, 3916, 943, 320, 31637, 7384, 326, 436, 310, 275, 958, 4477, 6034, 4154, 50275, 251, 247, 1199, 16055, 1268, 891, 452, 247, 1892, 673, 4685, 253, 16038, 323, 436, 789, 752, 310, 253, 44921, 897, 1083, 10076, 323, 436, 2238, 273, 2746, 253, 13506, 6667, 1223, 4722, 403, 1077, 523, 30487, 534, 310, 4030, 1580, 597, 250, 13506, 533, 253, 2929, 1057, 417, 2085, 247, 2266, 16038, 323, 253, 1524, 1533, 6667, 33039, 304, 11753, 285, 13898, 433, 303, 6533, 292, 2057, 841, 403, 671, 8489, 523, 30487, 1223, 253, 1442, 292, 16856, 10788, 26647, 4836, 762, 5955, 310, 4722, 697, 253, 7285, 820, 1032, 292, 1171, 460, 26647, 326, 556, 247, 2590, 897, 1083, 752, 310, 253, 44921, 441, 886, 511, 323, 247, 3488, 783, 3028, 5148, 613, 1216, 253, 10199, 5936, 2176, 9351, 273, 4030, 72, 11273, 9162, 533, 604, 436, 310, 253, 1083, 840, 352, 943, 320, 6949, 3587, 824, 347, 342, 12966, 390, 13898, 433, 1313, 404, 255, 49602, 50275, 783, 806, 12494, 273, 3239, 374, 310, 2834, 281, 14390, 1097, 984, 14683, 403, 8489, 1048, 285, 6571, 984, 352, 310, 417, 4745, 2590, 849, 11815, 403, 1563, 432, 18702, 26332, 253, 2529, 5329, 14577, 9381, 4168, 273, 3720, 273, 253, 390, 1159, 369, 8523, 21643, 281, 479, 984, 436, 17856, 4254, 1057, 417, 16497, 540, 41597, 715, 6313, 12419, 8697, 253, 4142, 273, 3116, 3307, 79, 285, 374, 79, 19, 403, 417, 4745, 4755, 436, 2593, 812, 897, 690, 14883, 318, 285, 37699, 50275, 74, 513, 417, 1089, 30762, 247, 21414, 30762, 247, 3916, 281, 320, 247, 20028, 326, 19025, 1507, 513, 417, 39970, 281, 39709, 4778, 13553, 1223, 352, 310, 2032, 326, 6194, 285, 1071, 13553, 513, 417, 14787, 407, 619, 4685, 253, 4839, 3045, 1060, 310, 4518, 1955, 281, 253, 958, 326, 3733, 673, 6046, 3386, 403, 1071, 673, 2625, 3386, 285, 12008, 26620, 285, 253, 2990, 556, 3365, 285, 9113, 6311, 281, 10476, 253, 4623, 6046, 3386, 347, 824, 642, 4715, 3169, 30410, 943, 320, 2104, 281, 8415, 436, 1895, 347, 3559, 352, 310, 1679, 247, 20028, 273, 253, 7787, 273, 19025, 1507, 285, 625, 247, 20028, 273, 253, 642, 1959, 11157, 10012, 247, 1199, 625, 4217, 285, 4722, 873, 273, 1543, 651, 320, 281, 10883, 253, 13553, 273, 3939, 4903, 12421, 689, 6194, 285, 1071, 594, 326, 13553, 
513, 417, 14787, 533, 3386, 403, 9696, 3939, 275, 15355, 387, 1097, 6194, 285, 1071, 673, 407, 619, 4685, 436, 310, 4555, 253, 3368, 275, 3036, 20, 987, 2167, 594, 4931, 436, 12758, 281, 816, 320, 5176, 7094, 50275, 43355, 897, 247, 3215, 11273, 501, 3024, 1093, 323, 253, 13898, 433, 303, 6533, 292, 3368, 533, 253, 2530, 2603, 323, 253, 1566, 310, 268, 1767, 263, 348, 3139, 849, 369, 436, 1566, 3215, 11273, 604, 253, 1566, 369, 3215, 11273, 327, 4440, 257, 292, 840, 352, 556, 644, 10166, 327, 253, 1071, 3888, 285, 841, 1543, 403, 417, 3588, 50275, 555, 5367, 275, 5933, 337, 3280, 253, 271, 29607, 6960, 4315, 50276, 266, 29607, 6960, 4315, 50276, 783, 2929, 29328, 271, 4722, 1895, 285, 1754, 327, 10527, 285, 16774, 1783, 3400, 247, 18176, 2900, 3374, 8424, 432, 253, 4931, 27662, 44003, 4028, 1142, 7794, 273, 253, 2929, 403, 275, 878, 273, 14883, 318, 841, 2486, 253, 16038, 1895, 873, 8777, 323, 253, 2710, 49602, 285, 253, 14720, 3212, 2176, 11815, 4390, 891, 717, 14755, 253, 2929, 347, 604, 619, 4685, 310, 3451, 285, 4477, 6034, 7125, 403, 3365, 3430, 533, 604, 841, 3374, 476, 320, 31637, 2654, 320, 5211, 281, 7164, 619, 4868, 50275, 5996, 49794, 50276, 43355, 6571, 2953, 253, 1840, 3374, 275, 5955, 285, 275, 253, 17265, 7714, 891, 452, 690, 42578, 7350, 6096, 342, 37317, 2805, 4185, 91, 5001, 253, 3480, 273, 5183, 8542, 5649, 285, 253, 958, 326, 7200, 15988, 275, 253, 625, 1524, 10186, 49602, 23973, 9619, 4103, 281, 253, 15265, 839, 1269, 263, 1895, 2299, 253, 17265, 2929, 310, 1199, 5520, 20178, 38135, 4558, 1029, 285, 327, 253, 2644, 253, 2929, 310, 273, 4209, 3290, 323, 14924, 50276, 7152, 33032, 2520, 2929, 25339, 1114, 4977, 3028, 285, 3488, 783, 3028, 43394, 275, 253, 3634, 273, 1643, 11860, 4715, 32495, 2074, 5971, 2223, 2430, 14720, 342, 13553, 273, 2176, 3386, 534, 476, 320, 3340, 11132, 672, 760, 247, 1643, 3733, 6667, 403, 2130, 767, 1846, 3510, 273, 1643, 11860, 3082, 19025, 1507, 285, 11038, 6928, 403, 2011, 281, 320, 7887, 49996, 285, 4116, 267, 49996, 2975, 1016, 342, 616, 11361, 285, 23797, 253, 4477, 9569, 247, 2969, 1332, 1754, 327, 1881, 42959, 347, 247, 2900, 281, 841, 7881, 342, 4679, 327, 2067, 20953, 8892, 285, 253, 625, 1524, 10186, 13898, 433, 303, 6533, 292, 5847, 337, 186, 9072, 16038, 285, 10199, 273, 1114, 4977, 3028, 285, 3488, 783, 3028, 43394, 10263, 432, 2720, 789, 432, 643, 4910, 374, 186, 47606, 1783, 285, 7126, 5304, 5904, 495, 186, 5123, 2969, 12419, 8892, 281, 5513, 4114, 4473, 1230, 577, 186, 88, 293, 392, 265, 1300, 20953, 15302, 285, 4679, 281, 12654, 24316, 273, 253, 3753, 273, 19025, 1507, 285, 11038, 6928, 275, 253, 3634, 273, 253, 2929, 608, 186, 4714, 15720, 50276, 5040, 337, 186, 81, 26080, 1318, 310, 2649, 4518, 5183, 11701, 689, 11038, 6928, 323, 253, 760, 25450, 899, 10895, 13898, 433, 303, 6533, 292, 310, 16888, 387, 1682, 374, 186, 15870, 1666, 25379, 275, 4679, 50276, 3062, 7000, 5701, 50276, 2520, 2929, 33888, 1643, 11860, 4715, 275, 253, 3634, 273, 1114, 4977, 3028, 285, 3488, 783, 3028, 43394, 436, 8668, 703, 1397, 690, 1708, 327, 849, 19025, 1507, 285, 11038, 6928, 1347, 9162, 273, 3386, 285, 627, 310, 690, 4722, 1783, 281, 896, 841, 24316, 598, 12342, 432, 253, 2929, 403, 6210, 1591, 446, 1243, 342, 2067, 2969, 6667, 285, 20953, 7533, 403, 4158, 281, 17093, 731, 45190, 33810, 253, 1783, 5644, 281, 247, 2969, 1881, 42959, 3169, 2900, 281, 3157, 3386, 323, 9162, 4583, 436, 2929, 9563, 973, 285, 253, 1783, 5644, 281, 690, 4722, 16039, 715, 1643, 11860, 4715, 50275, 358, 5378, 1037, 253, 1543, 403, 247, 1652, 625, 31623, 1223, 2067, 
20953, 4679, 24088, 1269, 263, 8985, 11559, 3488, 783, 3028, 278, 79, 382, 403, 6210, 392, 265, 1300, 281, 17093, 253, 11361, 273, 7296, 3488, 783, 3028, 6127, 697, 417, 347, 2590, 849, 1199, 253, 1524, 10186, 15646, 841, 5319, 253, 4081, 1332, 7052, 41731, 13015, 253, 1666, 25379, 275, 253, 20953, 7533, 533, 11701, 689, 253, 11038, 6928, 8245, 327, 13898, 433, 303, 6533, 292, 310, 16888, 387, 1682, 347, 824, 347, 5322, 347, 841, 16039, 403, 697, 417, 2590, 604, 697, 18236, 4217, 23000, 1223, 891, 2096, 253, 2770, 369, 10941, 342, 19025, 1507, 285, 11038, 6928, 347, 15572, 273, 7887, 3169, 285, 4116, 267, 49996, 352, 651, 452, 644, 5322, 281, 452, 2908, 625, 1666, 25379, 275, 253, 4679, 50275, 34974, 337, 186, 6050, 4473, 1230, 1842, 1767, 10117, 1676, 891, 369, 3786, 25229, 273, 13213, 3006, 49996, 347, 2057, 4116, 267, 390, 7887, 3169, 403, 627, 667, 643, 9351, 390, 1364, 247, 1327, 1595, 41454, 30410, 320, 7887, 1754, 285, 12008, 26620, 374, 186, 5430, 513, 643, 1643, 11860, 3082, 4944, 715, 436, 2891, 13646, 323, 1650, 752, 403, 9351, 273, 3386, 1114, 4977, 3028, 4632, 3488, 783, 3028, 403, 13757, 3169, 3082, 24088, 278, 16878, 4715, 752, 670, 3365, 3733, 247, 30410, 390, 256, 11618, 50274, 5996, 49794, 50276, 74, 5717, 253, 4477, 323, 512, 616, 3434, 1309, 253, 5955, 3408, 281, 19148, 2710, 3533, 670, 616, 2929, 285, 323, 3515, 3081, 1666, 25379, 846, 4361, 253, 643, 10123, 285, 6523, 253, 4477, 6128, 619, 17401, 4558, 6571, 253, 1072, 253, 747, 8668, 327, 5148, 613, 920, 2530, 407, 253, 4477, 310, 271, 4722, 581, 533, 253, 8542, 5373, 273, 436, 2746, 476, 1335, 320, 625, 345, 2414, 600, 5183, 3081, 4679, 275, 824, 897, 2219, 24088, 253, 277, 2072, 1650, 5393, 275, 581, 273, 253, 11985, 604, 824, 247, 10895, 4961, 651, 3012, 17084, 436, 2929, 347, 4767, 275, 253, 2022, 10123, 253, 16774, 1543, 403, 2649, 3782, 13943, 533, 4583, 253, 2929, 1057, 2085, 690, 4722, 1783, 326, 778, 11907, 747, 4088, 273, 4680, 670, 1643, 11860, 4715, 3237, 347, 824, 891, 1158, 436, 2929, 778, 320, 273, 1600, 281, 253, 17857, 32888, 3114, 5474, 33032, 2520, 2929, 806, 19401, 253, 7364, 273, 7887, 285, 4116, 267, 49996, 597, 4081, 271, 4116, 3169, 1332, 323, 4735, 5438, 281, 2953, 253, 3237, 273, 7887, 49996, 285, 4116, 267, 49996, 50276, 783, 4679, 327, 50276, 43249, 13506, 285, 1524, 10186, 1643, 11860, 4715, 8892, 1646, 1175, 50276, 296, 3755, 20556, 50275, 18, 253, 8813, 670, 253, 7881, 273, 7887, 3861, 49225, 6928, 285, 4116, 49996, 11038, 2990, 310, 4722, 50276, 19, 253, 16038, 327, 253, 3488, 783, 3028, 9162, 310, 973, 24013, 8550, 50275, 20, 973, 15720, 285, 3477, 281, 956, 50275, 20881, 1255, 265, 50276, 18, 3480, 273, 20417, 15832, 253, 4081, 4735, 27423, 5122, 310, 1512, 2969, 285, 253, 760, 20417, 15832, 50275, 9328, 760, 897, 1881, 42959, 281, 4044, 247, 1805, 6779, 2581, 685, 3587, 970, 3388, 45900, 3861, 49225, 6928, 390, 512, 253, 1329, 941, 11038, 2990, 50276, 19, 849, 281, 5206, 49495, 391, 50276, 262, 310, 247, 4373, 19484, 310, 253, 8750, 253, 391, 253, 1805, 1057, 352, 452, 642, 5170, 3033, 50276, 20, 3480, 253, 4679, 323, 5301, 342, 7887, 49996, 285, 4116, 267, 49996, 275, 253, 2022, 2929, 352, 310, 9560, 281, 921, 253, 3237, 273, 7887, 49996, 285, 4116, 267, 49996, 50276, 21, 50276, 1286, 42959, 671, 556, 690, 3602, 323, 253, 9261, 2139, 310, 253, 4081, 1332, 1327, 36928, 50276, 22, 253, 2905, 789, 310, 1512, 1652, 5046, 1908, 6240, 690, 1881, 42959, 789, 50274, 783, 7092, 4278, 285, 2905, 789, 273, 253, 4081, 2929, 943, 320, 2007, 2879, 2490, 187, 4118, 18435, 27, 2520, 2929, 3537, 
13505, 3237, 273, 5368, 7887, 5148, 613, 6118, 285, 4116, 267, 5148, 613, 6118, 323, 1643, 11860, 4715, 275, 3488, 783, 3028, 43394, 253, 7887, 5148, 613, 6118, 824, 347, 3861, 49225, 6928, 2430, 17619, 1180, 273, 21496, 7877, 1319, 285, 253, 4116, 267, 5148, 613, 6118, 403, 16931, 281, 3731, 42070, 253, 4477, 4081, 247, 2969, 2568, 3576, 1332, 281, 2953, 841, 3237, 285, 5183, 697, 12510, 275, 616, 4679, 436, 2929, 25339, 5148, 613, 920, 432, 247, 1077, 4451, 8668, 347, 20503, 407, 247, 37317, 285, 4518, 5544, 3237, 273, 7561, 3197, 5148, 613, 920, 3082, 2299, 436, 2929, 2770, 327, 3861, 49225, 6928, 285, 11038, 6928, 1014, 2167, 627, 452, 644, 4081, 1142, 5148, 613, 920, 3082, 690, 5368, 3082, 1646, 417, 281, 452, 253, 3237, 273, 3861, 49225, 6928, 285, 263, 11038, 6928, 275, 1635, 253, 8542, 5373, 273, 253, 4081, 2746, 403, 417, 973, 5183, 3738, 253, 3081, 4679, 275, 253, 2488, 2380, 9713, 690, 7350, 273, 253, 30628, 597, 403, 417, 2217, 281, 7568, 253, 12510, 273, 253, 4081, 1332 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper provides an optimal scaling analysis of locally balanced proposals lbp a recently proposed class of informed mcmc algorithms for discrete spaces see in particular [9, 10, 11] in particular the paper a) provides an optimal scaling result theorem 3.1 for lbp on discrete spaces more specifically on productform binary spaces interestingly the optimal acceptance rate and asymptotic scaling are the same as mala on continuous spaces ie an optimal acceptance rate of 0.574 and O(n^(1/3)) steps for each effective sample in dimension n see corollary 3.2 b) provides an optimal scaling result theorem 3.8 for randomwalk metropolis rwm under the stronger assumption that each bit has probability close to 0.5 under this somewhat more artificial scenario one recovers again the same behavior as rwm in continuous spaces namely an optimal acceptance rate of 0.234 and O(n) steps for each effective sample in dimension n see corollary 3.9 this result seems to be mainly interesting to have a comparison with the one in a) c) implements adaptive mcmc versions of multistep lbp exploiting the theoretical results of a) and b) and tests the resulting algorithms on a variety of sampling tasks in highdimensional discrete spaces d) the proposed methods are applied to energybased models ebm training obtaining a 2 to 5fold speedup compared to nontuned singlestep lbp strengths there is a large and wellestablished optimal scaling literature for mcmc in highd continuous spaces on the contrary providing an analogous and meaningful analysis for discrete spaces is not easy and could be helpful to advance the practical and theoretical understanding of informed mcmc in highd discrete spaces this paper provides an interesting and significant step in this direction coupling the optimal scaling framework with lbp and providing a meaningful and nontrivial analysis interestingly the authors obtain results that mirror the ones of continuous spaces both in terms of optimal acceptance rates and scaling with dimensionality while recent work using lbp for discrete space sampling eg [9, 10, 11] has witnessed promising empirical results in various tasks including ebm training there is not much theory even with some heuristics or on toy cases quantifying the potential improvements given by lbp in high dimensions the analysis of the current paper provides some interesting results in this direction suggesting an O(n^(2/3)) gain at least under some simplifying assumptions the numerical results are welldesigned match with remarkable fidelity what is predicted by the theory and suggest that the adaptive multistep lbp schemes under consideration are promising algorithms for discrete space sampling and ebm training weaknesses while the result for lbp is interesting and genuinely novel the one for rwm is derived under the unsatisfactory assumption that flipping probabilities are close to 0.5 ie the target is close to flat and is quite reminiscent of though technically different from the one in [30] both in terms of setting spirit and conclusion the results are derived for the specific case of productform binary targets which is quite specific and potentially restrictive also it somehow feels that in the context of discrete spaces which is probably harder to study in terms of the questions considered in the paper specific assumptions about the algorithm and the target distribution can play a more crucial role than in continuous spaces eg use of gradient approximation or not use of continuous versus discrete time lbp scheme binary versus more general spaces sampling bits with or without replacement etc for example it would be interesting to have theoretical results that also quantify the impact of the gradient approximation as in [10] since without that the cost per iteration of lbp and rwm is not comparable this however would require considering more complex target distributions some mathematical statements eg lemma 3.3 are hard to follow and not rigorous enough also there are quite a few typos throughout the paper more details below the novelty of the proposed methodology is relatively limited basically an adaptive version of previously proposed schemes in that sense the main contribution is the theoretical one which provides results on optimal acceptance rates to guide tuning and results quantifying the scaling with dimensionality under some simplifying assumptions the authors have adequately addressed the limitations and potential negative societal impact of their work
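Point c) above concerns adaptive tuning toward a fixed acceptance rate. As a rough illustration of what that can look like, here is a generic stochastic-approximation update with invented names (this is not the authors' algorithm; 0.574 and 0.234 are simply the values quoted in this review):

```python
import numpy as np

def adapt_radius(radius, accept_history, target=0.574, step=0.05, r_min=1.0):
    # generic Robbins-Monro style update of a proposal "radius" (e.g. the
    # expected number of coordinates flipped per move) toward a target
    # acceptance rate; 0.574 is the rate quoted above for multistep lbp,
    # 0.234 would be the classical random-walk metropolis target
    observed = float(np.mean(accept_history))
    radius = float(np.exp(np.log(radius) + step * (observed - target)))
    return max(radius, r_min)

# usage: after each batch of proposals, feed back the accept/reject indicators
radius = 3.0
accepts = [1, 0, 1, 1, 0, 1, 1, 0]
radius = adapt_radius(radius, accepts)
print(radius)
```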
docsepthis paper studies the optimal scaling of metropolis-hastings algorithms on the cube Z_2^n the analysis of optimal scaling for markov chain monte carlo methods has been studied extensively for continuous state spaces but not for discrete spaces the authors provide a scaling analysis for the locally balanced proposal lbp in [2] the authors show that the lbp has a better asymptotic property than the randomwalk metropolis rwm algorithm simulation results support the theoretical results the results in this paper are interesting and show a systematic difference between lbp and rwm lbp is a new and useful method for discrete state spaces i believe that the results in this paper are a good contribution to this field if the following weak points are solved from the theoretical point of view the result shows the similarity of lbp with the langevin algorithm therefore it is reasonable that lbp is superior to rwm from a practical point of view the result provides a guide for the choice of tuning parameters however i feel that there is a gap between theory and claim the efficiency presented in this paper only makes sense if the limit is a class of diffusion processes however the literature suggests that the limit is not a diffusion process but a poisson process therefore the meaning of efficiency is not obvious to me in this paper also i have a less clear picture of comparing lbp and rwm the analysis of lbp focuses on the case p << 1/2 while the analysis of rwm focuses on the case p ≈ 1/2 therefore it isnt easy to draw a general conclusion comparing the two algorithms however the paper [1] is useful in this comparison as a novelty i need to mention that the scaling limit for discrete space is not new more precisely the lbp results are new but the rwm results are known see [1] in [1] only the iid target distribution is considered so the class of target distribution in this paper is much more general in this sense however the authors only study the case where p ≈ 1/2 but [1] also studied p << 1/2 and identified the limit process the paper has good potential but i think there is room for improvement minor i recommend checking statements and proofs once again there are many minor errors for example in lemma 3.3 ab is a limit of the random variable say x_n but ab still depends on n it would be nice if the authors could comment on the guideline for choosing the function g [1] gareth o roberts 1998 optimal metropolis algorithms for product measures on the vertices of a hypercube stochastics and stochastic
reports 6234 275283 2 haoran sun hanjun dai wei xia and arun ramamurthy path auxiliary proposal for mcmc in discrete space in international conference on learning representations 2021 i updated my rating after reading the authors response and followup the main limitation is the gap between theory and claim the authors evaluate the expected squared jump distance however the meaning of this statistic is not clear in the current context since the limit process is a pure jump process and not the langevin diffusion docsepthe optimal scaling problem is the problem of finding the right scale of each jump in metropolishasting that leads to optimal convergence rate even though the problem has been wellstudied in the continuous settings it has been seldomly explored in discrete settings as a contribution in this direction the authors study the locally balanced proposal lbp proposed by zanella 2020 which is designed to move one coordinate at a time with the proposal distribution depending on the likelihood ratio between the local change later sun et al 2021 propose a more general version of lbp which can switch multiple coordinates in one step with this version of ldp the authors are able to derive the optimal scaling and with this scaling they show that the multistep lbp has an asymptotic acceptance rate that is independent of the target distribution which is greater than that of the random walk metropolis strengths this paper provides sufficient contribution towards efficient metropolishasting algorithms on discrete spaces the authors provide thorough analyses and experiments which agree with the results i find the application on efficient energy based model to be very interesting weaknesses while the paper can be easily read by experts in the fields i feel that some additionsexplanations would be appreciated by outsiders for example the authors should explain what scale means in both continuous and discrete settings and they should give a precise definition of optimal scaling problem in section 1 or 2 the authors have the following assumption on the target distribution pix prodi1n pixi 1pi1xi epsilon min pj 1pj frac12 epsilon qquad textfor some epsilon in 0frac14 how the choice of epsilon in eq 2 affects the rate of convergence in eq 6 is still unclear to me i have a hunch that the quality of the taylor approximation in lemma 33 depends on epsilon as well all in all i think the authors should expand more in the main paper on the effect of epsilon on the efficiency i see that the choices of epsilon are mentioned in appendix c1 but not in the main paper and whether the assumption above is reasonable in reallife scenarios eg in ebm in the experiments the authors only compare between two methods namely lbp and rwm even though there are many experiments in the past which have already shown effectiveness of lbp against other methods such as rwm gibbs sampling the hamming ball sampler and continuous relaxation based methods i still want to see how lbp performs against some of these methods at least in energy based models in addition i have found some potential mathematical typos in this paper page 3 eq 2 wedge is used here but has never been defined before page 4 eq 8 arr should be equal 2lnfrac23phileftfrac12lambda1 l frac32right in the asymptotic limit not exactly equal page 14 eq 25 what is m0 page 14 eq 29 and 30 should the middle expressions be put in absolute values page 14 eq 29 and 31 mathbbpwxuwnfrac12 should be mathbbpwxuwtnfrac12 page 15 eq 36 what is the appropriate value of t in order to 
apply lemma a1 page 16 eq 58 i am not sure how the factor of 1onfrac13 pops up here page 18 eq 84 xn should be xn uildots ur should be uildots ur and uiur should be uildots ur page 20 line 529 we have we have page 23 eq 164 should the lower order term depends on t and lambda2 as well also rfrac32 should be rfrac34 in addition to what the authors suggest in the discussion one might try to find nonasymptotic convergence rate as a function of epsilon there is also the question of finding the best weight function gt which might depend on the task at hand can we find a heuristic on g that gives us optimal or nearoptimal performance ### Summary:
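The reviews above repeatedly refer to tuning a proposal "scale" (here, the number of coordinates flipped per step) toward a target acceptance rate (0.574 for multistep LBP, 0.234 for RWM). The following sketch is purely illustrative and is not code from the paper under review: it runs a plain random-walk Metropolis sampler on a product-form binary target and crudely adapts the flip size toward a chosen acceptance rate. The full locally balanced proposal would additionally weight coordinates by a function g of the likelihood ratios; that part is omitted here, and all constants are arbitrary.

```python
import numpy as np

# Illustrative sketch only (assumed setup, not the reviewed paper's implementation):
# random-walk Metropolis on pi(x) = prod_i p_i^{x_i} (1 - p_i)^{1 - x_i} over {0,1}^n,
# with the number of flipped bits k adapted toward a target acceptance rate.
rng = np.random.default_rng(0)
n = 200
p = rng.uniform(0.2, 0.8, size=n)          # target marginals, bounded away from 0 and 1

def log_pi(x):
    return float(np.sum(x * np.log(p) + (1 - x) * np.log1p(-p)))

x = rng.integers(0, 2, size=n)
k = 5                                      # bits flipped per proposal (the "scale")
target_rate = 0.574                        # rate suggested by the scaling analysis for LBP
accepted = 0

for t in range(1, 5001):
    idx = rng.choice(n, size=k, replace=False)
    y = x.copy()
    y[idx] ^= 1                            # flip k coordinates; this proposal is symmetric
    if np.log(rng.uniform()) < log_pi(y) - log_pi(x):
        x, accepted = y, accepted + 1
    if t % 100 == 0:                       # crude adaptation of k toward the target rate
        rate = accepted / t
        k = int(np.clip(round(k * (1 + 0.5 * (rate - target_rate))), 1, n))

print(f"final flip size k = {k}, empirical acceptance rate = {accepted / t:.3f}")
```

In an actual LBP implementation the proposal is informed (coordinates are sampled proportionally to g applied to the local likelihood ratios), so the Metropolis–Hastings correction would also involve the ratio of forward and backward proposal probabilities rather than the simple symmetric ratio used here.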
### Summary:
The reviewers and I agree that the contributions of the paper are of interest and a useful addition to the literature; therefore, I recommend accepting the paper. Please consider the reviewers' comments when preparing the camera-ready version.
[ 436, 906, 3133, 281, 320, 7194, 4722, 281, 452, 247, 5301, 342, 253, 581, 275, 247, 50276, 68, 17930, 17825, 278, 3591, 68, 9508, 273, 1554, 382, 554, 298, 12303, 38883, 253, 10527, 1543, 273, 247, 285, 270, 285, 5216, 253, 4795, 11333, 327, 247, 5235, 273, 10491, 8892, 275, 1029, 6967, 13358, 8470, 50275, 69, 253, 4081, 3082, 403, 3732, 281, 2341, 3169, 3210, 299, 5844, 3733, 13546, 247, 374, 281, 608, 8089, 3885, 484, 2429, 281, 25450, 37437, 1625, 46701, 554, 298, 12303, 20544, 50275, 9088, 310, 247, 1781, 285, 973, 21877, 8654, 13642, 6239, 323, 278, 3591, 68, 275, 1029, 69, 5415, 8470, 327, 253, 10214, 5277, 271, 19890, 285, 14282, 1783, 323, 13358, 8470, 310, 417, 3477, 285, 812, 320, 9371, 281, 7170, 253, 8542, 285, 10527, 4685, 273, 8191, 278, 3591, 68, 275, 1029, 69, 13358, 8470, 436, 2929, 3400, 271, 4722, 285, 1534, 3213, 275, 436, 3884, 8789, 253, 8654, 13642, 7792, 342, 298, 12303, 285, 5277, 247, 14282, 285, 37825, 1783, 4722, 314, 253, 2488, 4044, 1543, 326, 11472, 253, 4394, 273, 5415, 8470, 1097, 275, 2426, 273, 8654, 14924, 4142, 285, 13642, 342, 7877, 1319, 50275, 6050, 3332, 789, 970, 298, 12303, 323, 13358, 8470, 10491, 24088, 898, 6903, 18, 452, 21153, 12532, 16774, 1543, 275, 2710, 8892, 1690, 299, 5844, 3733, 627, 310, 417, 1199, 3762, 1014, 342, 690, 344, 321, 3397, 390, 327, 20953, 2219, 2677, 5411, 253, 2442, 11701, 1677, 407, 298, 12303, 275, 1029, 4528, 5354, 253, 1783, 273, 253, 1655, 2929, 3400, 690, 4722, 1543, 275, 436, 3884, 7738, 271, 327, 1508, 6351, 387, 1878, 762, 690, 8077, 5411, 13260, 50275, 783, 10704, 1543, 403, 6210, 392, 265, 1300, 3761, 342, 13406, 32422, 752, 8131, 407, 253, 3762, 285, 1804, 326, 253, 17825, 1554, 382, 554, 298, 12303, 15849, 762, 8180, 403, 12532, 11333, 323, 13358, 2317, 10491, 285, 299, 5844, 3733, 50276, 20881, 1255, 265, 50275, 6050, 253, 906, 323, 298, 12303, 310, 4722, 285, 27364, 4460, 253, 581, 323, 391, 28386, 310, 6012, 762, 253, 49770, 9376, 326, 46899, 20552, 403, 2810, 281, 16987, 26332, 253, 2303, 310, 2810, 281, 6507, 285, 310, 3240, 35036, 273, 2167, 22335, 1027, 432, 253, 581, 275, 1884, 1097, 275, 2426, 273, 4758, 5968, 285, 6452, 50275, 783, 1543, 403, 6012, 323, 253, 2173, 1083, 273, 1885, 630, 8985, 8571, 534, 310, 3240, 2173, 285, 7826, 29190, 671, 352, 10380, 9193, 326, 275, 253, 3634, 273, 13358, 8470, 534, 310, 3164, 12150, 281, 1263, 275, 2426, 273, 253, 3533, 2783, 275, 253, 2929, 2173, 13260, 670, 253, 5933, 285, 253, 2303, 3268, 476, 1132, 247, 625, 9560, 2554, 685, 275, 5415, 8470, 24088, 897, 273, 11786, 11193, 390, 417, 897, 273, 5415, 7147, 13358, 673, 298, 12303, 6974, 8985, 7147, 625, 2087, 8470, 10491, 9886, 342, 390, 1293, 5407, 3966, 323, 1650, 352, 651, 320, 4722, 281, 452, 10527, 1543, 326, 671, 22048, 253, 3486, 273, 253, 11786, 11193, 347, 275, 884, 1580, 1293, 326, 253, 2105, 591, 19502, 273, 298, 12303, 285, 391, 28386, 310, 417, 10870, 436, 2299, 651, 2430, 7296, 625, 2570, 2303, 10670, 50275, 8826, 15965, 7234, 24088, 18057, 5922, 403, 1892, 281, 956, 285, 417, 26565, 2217, 671, 627, 403, 3240, 247, 1643, 963, 993, 4768, 253, 2929, 625, 4278, 2708, 50275, 783, 38135, 273, 253, 4081, 16182, 310, 4942, 3710, 10323, 271, 17825, 2715, 273, 3786, 4081, 15849, 275, 326, 3282, 253, 2022, 7680, 310, 253, 10527, 581, 534, 3400, 1543, 327, 8654, 14924, 4142, 281, 7102, 25184, 285, 1543, 2677, 5411, 253, 13642, 342, 7877, 1319, 762, 690, 8077, 5411, 13260, 253, 4477, 452, 18212, 9713, 253, 7364, 285, 2442, 4016, 38058, 3486, 273, 616, 789, 5474, 33032, 2520, 2929, 2175, 253, 8654, 13642, 
273, 1313, 18427, 763, 42118, 11333, 327, 253, 23636, 14168, 4482, 91, 19, 79, 253, 1783, 273, 8654, 13642, 323, 1616, 729, 5931, 1114, 442, 1113, 4213, 3082, 556, 644, 5421, 18171, 323, 5415, 1375, 8470, 533, 417, 323, 13358, 8470, 253, 4477, 2085, 247, 13642, 1783, 323, 253, 12171, 16645, 10419, 298, 12303, 275, 374, 253, 4477, 921, 326, 253, 298, 12303, 556, 247, 1805, 20185, 2867, 685, 253, 3632, 13678, 1313, 37489, 391, 28386, 5933, 9864, 1543, 1329, 253, 10527, 1543, 253, 1543, 275, 436, 2929, 403, 4722, 285, 921, 247, 12082, 3064, 875, 298, 12303, 285, 391, 28386, 298, 12303, 310, 247, 747, 285, 4217, 1332, 323, 13358, 1375, 2317, 891, 2868, 326, 253, 1543, 275, 436, 2929, 403, 247, 1175, 7680, 281, 436, 1673, 604, 253, 1563, 5075, 2792, 403, 14042, 432, 253, 10527, 1127, 273, 1859, 253, 906, 2722, 253, 14259, 273, 298, 12303, 342, 253, 298, 912, 8498, 5933, 3103, 352, 310, 5272, 326, 298, 12303, 310, 8936, 281, 391, 28386, 432, 247, 8542, 1127, 273, 1859, 253, 906, 3400, 247, 7102, 323, 253, 4327, 273, 25184, 3602, 50276, 35529, 891, 1928, 326, 627, 310, 247, 8037, 875, 3762, 285, 1750, 253, 6733, 3559, 275, 436, 2929, 760, 2789, 3282, 604, 253, 2701, 310, 247, 966, 273, 12393, 4870, 2299, 253, 6239, 5936, 326, 253, 2701, 310, 417, 247, 12393, 1232, 533, 247, 50276, 5367, 17469, 1232, 3103, 253, 4495, 273, 6733, 310, 417, 4755, 281, 479, 275, 436, 2929, 50276, 12563, 891, 452, 247, 1679, 2590, 5406, 273, 10941, 298, 12303, 285, 391, 28386, 253, 1783, 273, 298, 12303, 16633, 327, 253, 1083, 499, 77, 1249, 1223, 253, 1783, 273, 391, 28386, 16633, 327, 253, 1083, 268, 1192, 89, 1249, 3103, 352, 310, 2649, 3477, 281, 3812, 247, 2087, 6452, 10941, 253, 767, 11333, 2299, 253, 2929, 337, 310, 4217, 275, 436, 5301, 50276, 284, 247, 38135, 891, 878, 281, 3748, 326, 253, 13642, 2701, 323, 13358, 2317, 310, 417, 747, 625, 10534, 253, 298, 12303, 1543, 403, 747, 533, 253, 391, 28386, 1543, 403, 1929, 923, 337, 50276, 249, 337, 760, 253, 891, 301, 2303, 3268, 310, 2783, 594, 253, 966, 273, 2303, 3268, 275, 436, 2929, 310, 1199, 625, 2087, 275, 436, 3282, 50276, 35529, 253, 4477, 760, 1263, 253, 1083, 835, 268, 1192, 89, 1249, 533, 337, 671, 5421, 268, 26198, 1249, 285, 3636, 253, 2701, 1232, 50276, 783, 2929, 556, 1175, 2442, 533, 891, 1158, 627, 310, 2316, 323, 7756, 50276, 37585, 50276, 74, 5583, 12669, 7234, 285, 27947, 2378, 969, 627, 403, 1142, 5884, 6332, 323, 1650, 275, 18057, 5922, 490, 310, 247, 2701, 273, 253, 3632, 4778, 1333, 1269, 79, 533, 490, 1335, 7024, 327, 295, 50275, 262, 651, 320, 5322, 604, 253, 4477, 812, 4385, 327, 253, 29609, 323, 13887, 253, 1159, 305, 50275, 18, 305, 40654, 258, 687, 589, 1641, 8065, 8654, 1313, 37489, 11333, 323, 1885, 5593, 327, 253, 13388, 273, 247, 4373, 68, 4338, 331, 3770, 26245, 285, 19191, 5012, 721, 20210, 25255, 28933, 50276, 19, 419, 263, 266, 5101, 15761, 30986, 277, 2284, 359, 74, 1269, 571, 285, 549, 328, 17653, 312, 321, 24085, 1854, 24026, 10419, 323, 278, 3591, 68, 275, 13358, 2317, 275, 5213, 8059, 327, 4715, 14237, 43425, 50276, 74, 9300, 619, 13716, 846, 4361, 253, 4477, 2380, 285, 956, 484, 253, 2022, 12291, 310, 253, 8037, 875, 3762, 285, 1750, 253, 4477, 7472, 253, 3264, 30044, 6923, 4181, 2299, 253, 4495, 273, 436, 26312, 310, 417, 2590, 275, 253, 1655, 3634, 1580, 253, 2701, 1232, 310, 247, 6313, 6923, 1232, 285, 417, 253, 298, 912, 8498, 12393, 50276, 7152, 339, 431, 248, 8654, 13642, 1895, 310, 253, 1895, 273, 4560, 253, 987, 4311, 273, 1016, 6923, 275, 1313, 18427, 763, 14669, 326, 5644, 281, 8654, 14940, 2281, 1014, 2167, 
253, 1895, 556, 644, 973, 14091, 728, 275, 253, 5415, 7533, 352, 556, 644, 28277, 314, 14859, 275, 13358, 7533, 347, 247, 7680, 275, 436, 3884, 253, 4477, 1263, 253, 12171, 16645, 10419, 298, 12303, 4081, 407, 1182, 266, 5021, 9169, 534, 310, 4158, 281, 2118, 581, 13249, 387, 247, 673, 342, 253, 10419, 3268, 7293, 327, 253, 12177, 4313, 875, 253, 1980, 1818, 1996, 5101, 1162, 355, 43425, 12661, 247, 625, 2087, 2715, 273, 298, 12303, 534, 476, 5234, 2709, 11627, 275, 581, 3213, 342, 436, 2715, 273, 298, 12132, 253, 4477, 403, 2104, 281, 15313, 253, 8654, 13642, 285, 342, 436, 13642, 597, 921, 326, 253, 1554, 382, 554, 298, 12303, 556, 271, 20185, 14924, 2281, 326, 310, 3907, 273, 253, 2303, 3268, 534, 310, 3687, 685, 326, 273, 253, 3632, 2940, 1313, 37489, 50275, 296, 3755, 20556, 436, 2929, 3400, 4209, 7680, 4404, 5919, 1313, 18427, 763, 14669, 11333, 327, 13358, 8470, 253, 4477, 2085, 11080, 6260, 285, 4679, 534, 5194, 342, 253, 1543, 891, 1089, 253, 2898, 327, 5919, 2341, 1754, 1566, 281, 320, 1077, 4722, 50274, 20881, 1255, 265, 1223, 253, 2929, 476, 320, 4354, 1239, 407, 10071, 275, 253, 4910, 891, 1928, 326, 690, 1635, 11523, 11139, 569, 651, 320, 14109, 407, 20823, 5852, 323, 1650, 253, 4477, 943, 5513, 752, 4311, 2097, 275, 1097, 5415, 285, 13358, 7533, 285, 597, 943, 1918, 247, 10799, 5426, 273, 8654, 13642, 1895, 275, 2593, 337, 390, 374, 50276, 783, 4477, 452, 253, 1563, 9376, 327, 253, 2303, 3268, 8066, 50276, 856, 5168, 18, 79, 8066, 74, 337, 2059, 18, 2981, 50276, 4259, 50276, 1222, 268, 75, 50276, 18, 81, 75, 50275, 1124, 805, 50276, 4259, 2805, 3362, 2505, 1542, 690, 50276, 4259, 275, 470, 1124, 1047, 50276, 5430, 253, 4327, 273, 299, 4277, 275, 16186, 374, 11852, 253, 2281, 273, 14940, 275, 16186, 721, 310, 1335, 12744, 281, 479, 891, 452, 247, 288, 3204, 326, 253, 3290, 273, 253, 246, 9614, 11193, 275, 18057, 5922, 7024, 327, 299, 4277, 347, 973, 512, 275, 512, 891, 1158, 253, 4477, 943, 5645, 625, 275, 253, 2022, 2929, 327, 253, 1055, 273, 299, 4277, 327, 253, 6733, 891, 923, 326, 253, 10165, 273, 299, 4277, 403, 5393, 275, 30762, 260, 18, 533, 417, 275, 253, 2022, 2929, 285, 1880, 253, 9376, 1840, 310, 5272, 275, 294, 455, 1074, 15216, 24088, 275, 299, 5844, 50276, 249, 253, 4679, 253, 4477, 760, 7277, 875, 767, 3082, 10775, 298, 12303, 285, 391, 28386, 1014, 2167, 627, 403, 1142, 4679, 275, 253, 2469, 534, 452, 2168, 2011, 12510, 273, 298, 12303, 1411, 643, 3082, 824, 347, 391, 28386, 33342, 1768, 10491, 253, 288, 28444, 4023, 1775, 17407, 285, 5415, 17040, 1754, 3082, 891, 1335, 971, 281, 923, 849, 298, 12303, 17923, 1411, 690, 273, 841, 3082, 387, 1878, 275, 2341, 1754, 3210, 50276, 249, 1635, 891, 452, 1119, 690, 2442, 15965, 963, 993, 275, 436, 2929, 50276, 6377, 495, 16186, 374, 33906, 310, 908, 1060, 533, 556, 1620, 644, 2931, 1078, 50276, 6377, 577, 16186, 854, 4077, 943, 320, 4503, 374, 6677, 1124, 1508, 545, 587, 649, 1124, 805, 2260, 18, 298, 1315, 317, 1237, 918, 275, 253, 20185, 2701, 417, 4555, 4503, 50276, 6377, 1638, 16186, 2030, 752, 310, 278, 17, 50275, 6377, 1638, 16186, 3285, 285, 1884, 943, 253, 4766, 12091, 320, 1691, 275, 7880, 2193, 50276, 6377, 1638, 16186, 3285, 285, 4562, 14168, 4482, 81, 22358, 86, 939, 1124, 805, 943, 320, 14168, 4482, 81, 22358, 41440, 14543, 1124, 805, 50276, 6377, 1458, 16186, 5540, 752, 310, 253, 4569, 1318, 273, 246, 275, 1340, 281, 4647, 18057, 247, 18, 50276, 6377, 1668, 16186, 9135, 891, 717, 417, 2119, 849, 253, 2803, 273, 337, 251, 1124, 1012, 42206, 598, 1060, 50276, 6377, 1283, 16186, 11130, 1269, 79, 943, 320, 
1269, 79, 1484, 786, 1502, 2936, 943, 320, 1484, 786, 1502, 2936, 285, 28243, 321, 943, 320, 1484, 786, 1502, 2936, 50276, 6377, 1384, 1386, 40355, 359, 452, 50276, 664, 452, 50276, 6377, 3495, 16186, 17601, 943, 253, 2406, 1340, 1307, 7024, 327, 246, 285, 29331, 19, 347, 973, 671, 391, 1124, 1237, 943, 320, 391, 1124, 1706, 275, 1635, 281, 752, 253, 4477, 1804, 275, 253, 5955, 581, 1537, 1611, 281, 1089, 1327, 284, 40045, 3875, 14940, 2281, 347, 247, 1159, 273, 299, 4277, 627, 310, 671, 253, 1953, 273, 4560, 253, 1682, 2801, 1159, 305, 85, 534, 1537, 3469, 327, 253, 4836, 387, 1133, 476, 359, 1089, 247, 47641, 327, 305, 326, 4245, 441, 8654, 390, 2822, 29776, 3045, 2490, 187, 4118, 18435, 27, 783, 30628, 285, 891, 5194, 326, 253, 9021, 273, 253, 2929, 403, 273, 1600, 285, 4217, 1635, 281, 253, 6239, 3103, 891, 5583, 18738, 253, 2929, 50276, 32897, 1908, 253, 30628, 5701, 672, 13828, 253, 4049, 254, 609, 5102, 2715, 209 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 436, 906, 3133, 281, 320, 7194, 4722, 281, 452, 247, 5301, 342, 253, 581, 275, 247, 50276, 68, 17930, 17825, 278, 3591, 68, 9508, 273, 1554, 382, 554, 298, 12303, 38883, 253, 10527, 1543, 273, 247, 285, 270, 285, 5216, 253, 4795, 11333, 327, 247, 5235, 273, 10491, 8892, 275, 1029, 6967, 13358, 8470, 50275, 69, 253, 4081, 3082, 403, 3732, 281, 2341, 3169, 3210, 299, 5844, 3733, 13546, 247, 374, 281, 608, 8089, 3885, 484, 2429, 281, 25450, 37437, 1625, 46701, 554, 298, 12303, 20544, 50275, 9088, 310, 247, 1781, 285, 973, 21877, 8654, 13642, 6239, 323, 278, 3591, 68, 275, 1029, 69, 5415, 8470, 327, 253, 10214, 5277, 271, 19890, 285, 14282, 1783, 323, 13358, 8470, 310, 417, 3477, 285, 812, 320, 9371, 281, 7170, 253, 8542, 285, 10527, 4685, 273, 8191, 278, 3591, 68, 275, 1029, 69, 13358, 8470, 436, 2929, 3400, 271, 4722, 285, 1534, 3213, 275, 436, 3884, 8789, 253, 8654, 13642, 7792, 342, 298, 12303, 285, 5277, 247, 14282, 285, 37825, 1783, 4722, 314, 253, 2488, 4044, 1543, 326, 11472, 253, 4394, 273, 5415, 8470, 1097, 275, 2426, 273, 8654, 14924, 4142, 285, 13642, 342, 7877, 1319, 50275, 6050, 3332, 789, 970, 298, 12303, 323, 13358, 8470, 10491, 24088, 898, 6903, 18, 452, 21153, 12532, 16774, 1543, 275, 2710, 8892, 1690, 299, 5844, 3733, 627, 310, 417, 1199, 3762, 1014, 342, 690, 344, 321, 3397, 390, 327, 20953, 2219, 2677, 5411, 253, 2442, 11701, 1677, 407, 298, 12303, 275, 1029, 4528, 5354, 253, 1783, 273, 253, 1655, 2929, 3400, 690, 4722, 1543, 275, 436, 3884, 7738, 271, 327, 1508, 6351, 387, 1878, 762, 690, 8077, 5411, 13260, 50275, 783, 10704, 1543, 403, 6210, 392, 265, 1300, 3761, 342, 13406, 32422, 752, 8131, 407, 253, 3762, 285, 1804, 326, 253, 17825, 1554, 382, 554, 298, 12303, 15849, 762, 8180, 403, 12532, 11333, 323, 13358, 2317, 10491, 285, 299, 5844, 3733, 50276, 20881, 1255, 265, 50275, 6050, 253, 906, 323, 298, 12303, 310, 4722, 285, 27364, 4460, 253, 581, 323, 391, 28386, 310, 6012, 762, 253, 49770, 9376, 326, 46899, 20552, 403, 2810, 281, 16987, 26332, 253, 2303, 310, 2810, 281, 6507, 285, 310, 3240, 35036, 273, 2167, 22335, 1027, 432, 253, 581, 275, 1884, 1097, 275, 2426, 273, 4758, 5968, 285, 6452, 50275, 783, 1543, 403, 6012, 323, 253, 2173, 1083, 273, 1885, 630, 8985, 8571, 534, 310, 3240, 2173, 285, 7826, 29190, 671, 352, 10380, 9193, 326, 275, 253, 3634, 273, 13358, 8470, 534, 310, 3164, 12150, 281, 1263, 275, 2426, 273, 253, 3533, 2783, 275, 253, 2929, 2173, 13260, 670, 253, 5933, 285, 253, 2303, 3268, 476, 1132, 247, 625, 9560, 2554, 685, 275, 5415, 8470, 24088, 897, 273, 11786, 11193, 390, 417, 897, 273, 5415, 7147, 13358, 673, 298, 12303, 6974, 8985, 7147, 625, 2087, 8470, 10491, 9886, 342, 390, 1293, 5407, 3966, 323, 1650, 352, 651, 320, 4722, 281, 452, 10527, 1543, 326, 671, 22048, 253, 3486, 273, 253, 11786, 11193, 347, 275, 884, 1580, 1293, 326, 253, 2105, 591, 19502, 273, 298, 12303, 285, 391, 28386, 310, 417, 10870, 436, 2299, 651, 2430, 7296, 625, 2570, 2303, 10670, 50275, 8826, 15965, 7234, 24088, 18057, 5922, 403, 1892, 281, 956, 285, 417, 26565, 2217, 671, 627, 403, 3240, 247, 1643, 963, 993, 4768, 253, 2929, 625, 4278, 2708, 50275, 783, 38135, 273, 253, 4081, 16182, 310, 4942, 3710, 10323, 271, 17825, 2715, 273, 3786, 4081, 15849, 275, 326, 3282, 253, 2022, 7680, 310, 253, 10527, 581, 534, 3400, 1543, 327, 8654, 14924, 4142, 281, 7102, 25184, 285, 1543, 2677, 5411, 253, 13642, 342, 7877, 1319, 762, 690, 8077, 5411, 13260, 253, 4477, 452, 18212, 9713, 253, 7364, 285, 2442, 4016, 38058, 3486, 273, 616, 789, 5474, 33032, 2520, 2929, 2175, 253, 8654, 13642, 
273, 1313, 18427, 763, 42118, 11333, 327, 253, 23636, 14168, 4482, 91, 19, 79, 253, 1783, 273, 8654, 13642, 323, 1616, 729, 5931, 1114, 442, 1113, 4213, 3082, 556, 644, 5421, 18171, 323, 5415, 1375, 8470, 533, 417, 323, 13358, 8470, 253, 4477, 2085, 247, 13642, 1783, 323, 253, 12171, 16645, 10419, 298, 12303, 275, 374, 253, 4477, 921, 326, 253, 298, 12303, 556, 247, 1805, 20185, 2867, 685, 253, 3632, 13678, 1313, 37489, 391, 28386, 5933, 9864, 1543, 1329, 253, 10527, 1543, 253, 1543, 275, 436, 2929, 403, 4722, 285, 921, 247, 12082, 3064, 875, 298, 12303, 285, 391, 28386, 298, 12303, 310, 247, 747, 285, 4217, 1332, 323, 13358, 1375, 2317, 891, 2868, 326, 253, 1543, 275, 436, 2929, 403, 247, 1175, 7680, 281, 436, 1673, 604, 253, 1563, 5075, 2792, 403, 14042, 432, 253, 10527, 1127, 273, 1859, 253, 906, 2722, 253, 14259, 273, 298, 12303, 342, 253, 298, 912, 8498, 5933, 3103, 352, 310, 5272, 326, 298, 12303, 310, 8936, 281, 391, 28386, 432, 247, 8542, 1127, 273, 1859, 253, 906, 3400, 247, 7102, 323, 253, 4327, 273, 25184, 3602, 50276, 35529, 891, 1928, 326, 627, 310, 247, 8037, 875, 3762, 285, 1750, 253, 6733, 3559, 275, 436, 2929, 760, 2789, 3282, 604, 253, 2701, 310, 247, 966, 273, 12393, 4870, 2299, 253, 6239, 5936, 326, 253, 2701, 310, 417, 247, 12393, 1232, 533, 247, 50276, 5367, 17469, 1232, 3103, 253, 4495, 273, 6733, 310, 417, 4755, 281, 479, 275, 436, 2929, 50276, 12563, 891, 452, 247, 1679, 2590, 5406, 273, 10941, 298, 12303, 285, 391, 28386, 253, 1783, 273, 298, 12303, 16633, 327, 253, 1083, 499, 77, 1249, 1223, 253, 1783, 273, 391, 28386, 16633, 327, 253, 1083, 268, 1192, 89, 1249, 3103, 352, 310, 2649, 3477, 281, 3812, 247, 2087, 6452, 10941, 253, 767, 11333, 2299, 253, 2929, 337, 310, 4217, 275, 436, 5301, 50276, 284, 247, 38135, 891, 878, 281, 3748, 326, 253, 13642, 2701, 323, 13358, 2317, 310, 417, 747, 625, 10534, 253, 298, 12303, 1543, 403, 747, 533, 253, 391, 28386, 1543, 403, 1929, 923, 337, 50276, 249, 337, 760, 253, 891, 301, 2303, 3268, 310, 2783, 594, 253, 966, 273, 2303, 3268, 275, 436, 2929, 310, 1199, 625, 2087, 275, 436, 3282, 50276, 35529, 253, 4477, 760, 1263, 253, 1083, 835, 268, 1192, 89, 1249, 533, 337, 671, 5421, 268, 26198, 1249, 285, 3636, 253, 2701, 1232, 50276, 783, 2929, 556, 1175, 2442, 533, 891, 1158, 627, 310, 2316, 323, 7756, 50276, 37585, 50276, 74, 5583, 12669, 7234, 285, 27947, 2378, 969, 627, 403, 1142, 5884, 6332, 323, 1650, 275, 18057, 5922, 490, 310, 247, 2701, 273, 253, 3632, 4778, 1333, 1269, 79, 533, 490, 1335, 7024, 327, 295, 50275, 262, 651, 320, 5322, 604, 253, 4477, 812, 4385, 327, 253, 29609, 323, 13887, 253, 1159, 305, 50275, 18, 305, 40654, 258, 687, 589, 1641, 8065, 8654, 1313, 37489, 11333, 323, 1885, 5593, 327, 253, 13388, 273, 247, 4373, 68, 4338, 331, 3770, 26245, 285, 19191, 5012, 721, 20210, 25255, 28933, 50276, 19, 419, 263, 266, 5101, 15761, 30986, 277, 2284, 359, 74, 1269, 571, 285, 549, 328, 17653, 312, 321, 24085, 1854, 24026, 10419, 323, 278, 3591, 68, 275, 13358, 2317, 275, 5213, 8059, 327, 4715, 14237, 43425, 50276, 74, 9300, 619, 13716, 846, 4361, 253, 4477, 2380, 285, 956, 484, 253, 2022, 12291, 310, 253, 8037, 875, 3762, 285, 1750, 253, 4477, 7472, 253, 3264, 30044, 6923, 4181, 2299, 253, 4495, 273, 436, 26312, 310, 417, 2590, 275, 253, 1655, 3634, 1580, 253, 2701, 1232, 310, 247, 6313, 6923, 1232, 285, 417, 253, 298, 912, 8498, 12393, 50276, 7152, 339, 431, 248, 8654, 13642, 1895, 310, 253, 1895, 273, 4560, 253, 987, 4311, 273, 1016, 6923, 275, 1313, 18427, 763, 14669, 326, 5644, 281, 8654, 14940, 2281, 1014, 2167, 
253, 1895, 556, 644, 973, 14091, 728, 275, 253, 5415, 7533, 352, 556, 644, 28277, 314, 14859, 275, 13358, 7533, 347, 247, 7680, 275, 436, 3884, 253, 4477, 1263, 253, 12171, 16645, 10419, 298, 12303, 4081, 407, 1182, 266, 5021, 9169, 534, 310, 4158, 281, 2118, 581, 13249, 387, 247, 673, 342, 253, 10419, 3268, 7293, 327, 253, 12177, 4313, 875, 253, 1980, 1818, 1996, 5101, 1162, 355, 43425, 12661, 247, 625, 2087, 2715, 273, 298, 12303, 534, 476, 5234, 2709, 11627, 275, 581, 3213, 342, 436, 2715, 273, 298, 12132, 253, 4477, 403, 2104, 281, 15313, 253, 8654, 13642, 285, 342, 436, 13642, 597, 921, 326, 253, 1554, 382, 554, 298, 12303, 556, 271, 20185, 14924, 2281, 326, 310, 3907, 273, 253, 2303, 3268, 534, 310, 3687, 685, 326, 273, 253, 3632, 2940, 1313, 37489, 50275, 296, 3755, 20556, 436, 2929, 3400, 4209, 7680, 4404, 5919, 1313, 18427, 763, 14669, 11333, 327, 13358, 8470, 253, 4477, 2085, 11080, 6260, 285, 4679, 534, 5194, 342, 253, 1543, 891, 1089, 253, 2898, 327, 5919, 2341, 1754, 1566, 281, 320, 1077, 4722, 50274, 20881, 1255, 265, 1223, 253, 2929, 476, 320, 4354, 1239, 407, 10071, 275, 253, 4910, 891, 1928, 326, 690, 1635, 11523, 11139, 569, 651, 320, 14109, 407, 20823, 5852, 323, 1650, 253, 4477, 943, 5513, 752, 4311, 2097, 275, 1097, 5415, 285, 13358, 7533, 285, 597, 943, 1918, 247, 10799, 5426, 273, 8654, 13642, 1895, 275, 2593, 337, 390, 374, 50276, 783, 4477, 452, 253, 1563, 9376, 327, 253, 2303, 3268, 8066, 50276, 856, 5168, 18, 79, 8066, 74, 337, 2059, 18, 2981, 50276, 4259, 50276, 1222, 268, 75, 50276, 18, 81, 75, 50275, 1124, 805, 50276, 4259, 2805, 3362, 2505, 1542, 690, 50276, 4259, 275, 470, 1124, 1047, 50276, 5430, 253, 4327, 273, 299, 4277, 275, 16186, 374, 11852, 253, 2281, 273, 14940, 275, 16186, 721, 310, 1335, 12744, 281, 479, 891, 452, 247, 288, 3204, 326, 253, 3290, 273, 253, 246, 9614, 11193, 275, 18057, 5922, 7024, 327, 299, 4277, 347, 973, 512, 275, 512, 891, 1158, 253, 4477, 943, 5645, 625, 275, 253, 2022, 2929, 327, 253, 1055, 273, 299, 4277, 327, 253, 6733, 891, 923, 326, 253, 10165, 273, 299, 4277, 403, 5393, 275, 30762, 260, 18, 533, 417, 275, 253, 2022, 2929, 285, 1880, 253, 9376, 1840, 310, 5272, 275, 294, 455, 1074, 15216, 24088, 275, 299, 5844, 50276, 249, 253, 4679, 253, 4477, 760, 7277, 875, 767, 3082, 10775, 298, 12303, 285, 391, 28386, 1014, 2167, 627, 403, 1142, 4679, 275, 253, 2469, 534, 452, 2168, 2011, 12510, 273, 298, 12303, 1411, 643, 3082, 824, 347, 391, 28386, 33342, 1768, 10491, 253, 288, 28444, 4023, 1775, 17407, 285, 5415, 17040, 1754, 3082, 891, 1335, 971, 281, 923, 849, 298, 12303, 17923, 1411, 690, 273, 841, 3082, 387, 1878, 275, 2341, 1754, 3210, 50276, 249, 1635, 891, 452, 1119, 690, 2442, 15965, 963, 993, 275, 436, 2929, 50276, 6377, 495, 16186, 374, 33906, 310, 908, 1060, 533, 556, 1620, 644, 2931, 1078, 50276, 6377, 577, 16186, 854, 4077, 943, 320, 4503, 374, 6677, 1124, 1508, 545, 587, 649, 1124, 805, 2260, 18, 298, 1315, 317, 1237, 918, 275, 253, 20185, 2701, 417, 4555, 4503, 50276, 6377, 1638, 16186, 2030, 752, 310, 278, 17, 50275, 6377, 1638, 16186, 3285, 285, 1884, 943, 253, 4766, 12091, 320, 1691, 275, 7880, 2193, 50276, 6377, 1638, 16186, 3285, 285, 4562, 14168, 4482, 81, 22358, 86, 939, 1124, 805, 943, 320, 14168, 4482, 81, 22358, 41440, 14543, 1124, 805, 50276, 6377, 1458, 16186, 5540, 752, 310, 253, 4569, 1318, 273, 246, 275, 1340, 281, 4647, 18057, 247, 18, 50276, 6377, 1668, 16186, 9135, 891, 717, 417, 2119, 849, 253, 2803, 273, 337, 251, 1124, 1012, 42206, 598, 1060, 50276, 6377, 1283, 16186, 11130, 1269, 79, 943, 320, 
1269, 79, 1484, 786, 1502, 2936, 943, 320, 1484, 786, 1502, 2936, 285, 28243, 321, 943, 320, 1484, 786, 1502, 2936, 50276, 6377, 1384, 1386, 40355, 359, 452, 50276, 664, 452, 50276, 6377, 3495, 16186, 17601, 943, 253, 2406, 1340, 1307, 7024, 327, 246, 285, 29331, 19, 347, 973, 671, 391, 1124, 1237, 943, 320, 391, 1124, 1706, 275, 1635, 281, 752, 253, 4477, 1804, 275, 253, 5955, 581, 1537, 1611, 281, 1089, 1327, 284, 40045, 3875, 14940, 2281, 347, 247, 1159, 273, 299, 4277, 627, 310, 671, 253, 1953, 273, 4560, 253, 1682, 2801, 1159, 305, 85, 534, 1537, 3469, 327, 253, 4836, 387, 1133, 476, 359, 1089, 247, 47641, 327, 305, 326, 4245, 441, 8654, 390, 2822, 29776, 3045, 2490, 187, 4118, 18435, 27, 783, 30628, 285, 891, 5194, 326, 253, 9021, 273, 253, 2929, 403, 273, 1600, 285, 4217, 1635, 281, 253, 6239, 3103, 891, 5583, 18738, 253, 2929, 50276, 32897, 1908, 253, 30628, 5701, 672, 13828, 253, 4049, 254, 609, 5102, 2715, 209 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
This paper proposes a new sequence-to-sequence model where attention is treated as a latent variable, and derives novel inference procedures for this model. The approach obtains significant improvements in machine translation and morphological inflection generation tasks. An approximation is also used to make hard attention more efficient by reducing the number of softmaxes that have to be computed.

Strengths:
- Novel, principled sequence-to-sequence model.
- Strong experimental results in machine translation and morphological inflection.

Weaknesses:
- Connections can be made with previous closely related architectures.
- Further ablation experiments could be included.

The derivation of the model would be clearer if it were first derived without attention feeding; the assumption that the output depends only on the current attention variable is then valid. The Markov assumption on the attention variable should also be stated as an assumption rather than an approximation. Given that assumption, as far as I can tell, the posterior inference procedure that is derived is exact: it is indeed equivalent to the forward computation of the classic forward–backward algorithm for HMMs. The model's overall distribution can then be defined in a somewhat different way than the authors' presentation, which I think makes clearer what the model is doing:
$$p(y \mid x) = \sum_{a} \prod_{t=1}^{n} p(y_t \mid y_{<t}, x, a_t)\, p(a_t \mid y_{<t}, x, a_{t-1}).$$
The equations derived in the paper for computing the prior and posterior attention are then just a dynamic program for computing this distribution, equivalent to the forward algorithm, which in this context is
$$\alpha_t(a) = p(a_t = a \mid y_{\le t}) \propto p(y_t \mid s_t, a_t = a) \sum_{a'} \alpha_{t-1}(a')\, p(a_t = a \mid s_t, a_{t-1} = a').$$
(A toy numerical sketch of this recursion is given after the reviews below.) The only substantial difference in the inference procedure is then that the posterior attention probability is fed into the decoder RNN, which means that the independence assumptions are no longer strictly valid, even though the structural assumptions are still encoded through the way inference is done.

[1] recently proposed a model with a similar factorization, although that model did not feed the attention distribution and performed EM-like inference with the forward–backward algorithm, while this model effectively computes forward probabilities and performs inference through automatic differentiation. The prior-joint variant (though its definition is not as clear as it should be) seems to assume that the attention distribution at each time step is independent of the previous attention, similar to the way standard soft attention is computed; the equations then reduce to a neural version of IBM alignment model 1, similar to another recently proposed model [2]. These papers can be seen as concurrent work, and this paper provides important insights, but it would strengthen rather than weaken the paper to make these connections clear.

The results clearly show the advantages of the proposed approach over soft and sparse attention baselines. However, the difference in BLEU score between the variants of the prior or posterior attention models is very small across all translation datasets, so to make claims about which of the variants are better, at a minimum statistical significance testing should be done. Given that the prior-joint model performs competitively, is it computationally more efficient than the full model? The main missing experiment is not doing attention feeding at all. The other experiment that is not included, as I understood it, is to compute prior and posterior attention but feed the prior attention rather than the posterior attention.

The paper is mostly written very clearly; there are just a few typos and grammatical errors in Sections 4.2 and 4.3. Overall, I really like this paper and would like to see it accepted, although I hope that a revised version would make the assumptions the model is making clearer and make connections to related models clearer.

[1] Neural hidden Markov model for machine translation. Wang et al., ACL 2018.
[2] Hard non-monotonic attention for character-level transduction. Wu, Shapiro, and Cotterell, EMNLP 2018.

docsep

Originality: existing attention models do not statistically express interactions among multiple attentions. The authors of this manuscript reformulate p(y|x) and define a prior attention distribution (a_t depends on previous outputs y_{<t}) and a posterior attention distribution (a_t depends on the current output y_t as well), and essentially compute the prior attention at the current position using the posterior attention at the previous position. The hypothesis and derivations make statistical sense, and a couple of assumptions/approximations seem to be mild.

Quality: the overall quality of this paper is technically sound; it pushes forward the development of attention models in sequence-to-sequence mapping.

Clarity: the ideas are presented well if the readers go through the paper slowly or twice. However, the authors need to clarify the following issues:
- x_a is not well defined in Section 2.2.
- p(y) as a short form of Pr(y | x_{1:m}) could be problematic and confusing when interpreting which variables the dependency is over.
- Page 3, line 19 of Section 2.2.1: should s_{n-1} be s_{t-1}?
- In Postr-Joint (Eq. 5) and others, I believe a_{t-1} is better than a, because the former indicates it is the attention for position t-1.
- I am a bit lost in the description of coupling energies: the two formulas for proximity-biased coupling and monotonicity-biased coupling are not well explained.

In addition to the above major issues, I also identified a few minor ones:
- "significant find" should be "significant finding" (last line of page 2).
- Should p(y_t | y_{<t}, a_n), a_n be p(y_t | y_{<t}, a_t), a_t?
- "topk" should be "top-k".
- "a equally weighted combination" should be "an equally weighted combination".
- Some citations are not used properly (such as the last 3rd line of page 4), and brackets are forgotten in some places, etc.
- End of Section 3: x should be in boldface.
- "nondifferentiability" should be "non-differentiability".
- A full stop is missing in some places.
- Luong attention is not defined.

Significance: comparisons with an existing soft-attention model and a sparse-attention model on five machine translation datasets show that the performance of using posterior attention is indeed better than the benchmark models.

Update: I have read the authors' response; my current rating is final.

docsep

Pros:
1. This work presents a novel construction of the popularly used attention modules. It points out the problem in existing designs that attention vectors are computed based only on parametric functions, instead of considering the interactions among each attention step and the output variables. To achieve that, the authors rewrite the joint distribution as a product of tractable terms at each timestamp and fully exploit the dependencies among attention and output variables across the sequence. The motivation is clear and the proposed strategy is original and to the point; this makes the work relatively solid and interesting for publication. Furthermore, the authors propose 3 different formulations for the prior attention, making the work even stronger.
2. The technical content looks good, with each formula written clearly and with sufficient deductive steps. Figure 1 provides a clear illustration of the comparison with traditional attention and shows the advantage of the proposed model.
3. Extensive experiments are conducted, including 5 machine translation tasks as well as another morphological inflection task. These results make the claims more convincing. The authors also conducted further experiments to analyze the effectiveness, including an attention entropy evaluation.

Cons:
1. The rich information contained in the paper is not very well organized; it takes some time to digest due to some unclear or missing statements. Specifically, the computation for the prior attention should be organized in a subsection with a section name; the 3 different formulations should first be summarized and started from the same core formula as (4). In this way it will become clearer where Eq. 6 comes from or what it is used for; currently this part is confusing.
2. Many substitutions of variables take place without detailed explanation (e.g. y_t with s_t, a with x_a in (11), etc.). Could you explain before making these substitutions?
3. As mentioned, the PAM actually computes hard attention. It would be better to make the statement clearer by explicitly explaining how Eq. 11 assembles the hard attention computation.

Q&A:
1. In the equation above (3) that computes the prior over a_t, can you explain how p(a_{t-1} | y_{<t}) approximates p(a_t | y_{<t})? What is the assumption?
2. How is Eq. 5 computed using a first-order Taylor expansion? How do you move Postr inside the probability, and where does x_a come from?
3. Transferring from p(y) on top of page 3 to Eq. 11, how do you substitute y_t, a_t with s_t, x_j? Is there a typo for x_j?
4. Can you explain how the baseline Prior-Joint is constructed? Specifically, how do you compute the prior using soft attention without Postr?
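The first review above rewrites the model's inference as a forward-algorithm recursion over attention positions. The following sketch is purely illustrative (it is not the authors' code, and all distributions are random placeholders rather than encoder/decoder outputs); it only shows the shape of that recursion: the prior attention at step t is obtained by propagating the previous posterior through a transition term, and the posterior is obtained by reweighting the prior with the current output likelihood.

```python
import numpy as np

# Toy forward recursion for posterior attention over m source positions and T output steps.
# trans[i, j] plays the role of p(a_t = j | a_{t-1} = i); emit[t, j] plays the role of
# p(y_t | x_j) up to a constant. In the real model these would come from neural networks.
rng = np.random.default_rng(1)
m, T = 6, 4

def normalize(v):
    return v / v.sum(axis=-1, keepdims=True)

trans = normalize(rng.random((m, m)))      # row-stochastic attention transition
emit = normalize(rng.random((T, m)))       # per-step output likelihood over positions

alpha = np.full(m, 1.0 / m)                # prior over a_1 before observing y_1
for t in range(T):
    prior_t = alpha if t == 0 else alpha @ trans   # prior attention at step t
    posterior_t = normalize(prior_t * emit[t])     # condition on the observed y_t
    alpha = posterior_t                            # carried to the next step (and fed to the decoder)
    print(f"step {t}: posterior attention = {np.round(posterior_t, 3)}")
```

Under the Markov and single-alignment assumptions discussed in the review, this recursion computes the posterior attention exactly; feeding it back into the decoder RNN is what breaks the strict validity of those independence assumptions.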
### Summary:
The reviewers of this paper agreed that it has done a stellar job of presenting a novel and principled approach to attention as a latent variable, providing a new and sound set of inference techniques to this end. This builds on top of a discussion of the limitations of existing deterministic approaches to attention, and frames the contribution well in relation to other recurrent and stochastic approaches to attention. While there are a few issues with clarity surrounding some aspects of the proposed method, which the authors are encouraged to fine-tune in their final version while paying careful attention to the review comments, this paper is more or less ready for publication with a few tweaks. It makes a clear, significant, and well-evaluated contribution to the field of attention models in sequence-to-sequence architectures and will be of great interest to many attendees at ICLR.
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 247, 747, 3425, 281, 3425, 1566, 835, 4116, 310, 4127, 347, 247, 21624, 4778, 285, 15313, 4460, 17032, 7259, 323, 436, 1566, 253, 2746, 31326, 1534, 11701, 275, 5145, 10234, 285, 21418, 2192, 1788, 5978, 8892, 271, 11193, 310, 671, 908, 281, 1056, 1892, 4116, 625, 5919, 407, 8493, 253, 1180, 273, 2602, 4090, 265, 326, 452, 281, 320, 10302, 50274, 296, 3755, 20556, 50276, 2369, 652, 3505, 74, 6216, 3425, 281, 3425, 1566, 50276, 9072, 5661, 1543, 275, 5145, 10234, 285, 21418, 2192, 1788, 32213, 50276, 35002, 476, 320, 1160, 342, 2045, 8244, 2905, 35615, 50276, 44295, 28913, 4679, 812, 320, 2908, 50275, 783, 28529, 273, 253, 1566, 651, 320, 625, 2590, 604, 352, 310, 806, 6012, 1293, 4116, 12422, 253, 9376, 326, 3453, 310, 7976, 760, 327, 253, 1655, 4116, 4778, 310, 840, 3588, 253, 1616, 729, 9376, 327, 253, 4116, 4778, 943, 671, 320, 4767, 347, 271, 9376, 2581, 685, 271, 11193, 1677, 326, 9376, 347, 2080, 347, 891, 476, 2028, 253, 12637, 17032, 5199, 326, 310, 6012, 310, 3242, 352, 310, 6296, 6425, 281, 253, 970, 253, 3579, 13782, 273, 253, 10610, 3579, 2135, 1034, 5933, 323, 288, 78, 983, 281, 513, 17032, 50276, 783, 3210, 4583, 3268, 476, 840, 320, 2931, 275, 247, 8489, 1027, 1039, 685, 253, 4477, 9759, 534, 891, 1158, 2789, 625, 2590, 752, 253, 1566, 310, 2509, 7239, 50276, 89, 50276, 2204, 66, 354, 7064, 18, 79, 268, 1767, 50276, 1767, 1269, 387, 869, 50276, 1767, 1269, 387, 18, 50275, 783, 7424, 6012, 275, 253, 2929, 323, 12672, 253, 2720, 285, 12637, 4116, 310, 840, 816, 247, 7870, 2086, 323, 12672, 436, 3268, 285, 310, 6425, 281, 970, 253, 3579, 5933, 534, 275, 436, 3634, 310, 50276, 21697, 682, 50276, 4066, 50276, 66, 340, 85, 50276, 81, 1767, 50276, 296, 387, 247, 2020, 66, 355, 545, 255, 18, 66, 869, 50276, 66, 50276, 296, 387, 18, 50276, 66, 50275, 783, 760, 6832, 3064, 275, 253, 17032, 5199, 310, 840, 326, 253, 12637, 4116, 5912, 310, 10208, 715, 253, 29810, 391, 9866, 534, 2097, 326, 253, 14275, 13260, 403, 417, 13714, 3588, 667, 625, 1014, 2167, 253, 8350, 13260, 403, 1335, 16202, 949, 253, 1039, 17032, 310, 2218, 50276, 18, 4102, 4081, 247, 1566, 342, 247, 2074, 39401, 3738, 326, 1566, 858, 417, 3997, 253, 4116, 3268, 285, 2684, 802, 3022, 17032, 342, 253, 3579, 2135, 1034, 5933, 1223, 436, 1566, 310, 8069, 12672, 3579, 20552, 285, 9591, 17032, 949, 12077, 9827, 50276, 783, 2720, 16662, 12955, 2167, 697, 5426, 310, 417, 347, 2590, 347, 352, 943, 320, 3133, 281, 320, 7384, 326, 253, 4116, 3268, 387, 1016, 673, 3213, 310, 3907, 273, 253, 2045, 4116, 2074, 281, 253, 1039, 2629, 2602, 4116, 310, 10302, 50276, 783, 7424, 840, 4796, 281, 247, 11454, 2715, 273, 18890, 78, 12420, 1566, 337, 2074, 281, 1529, 4102, 4081, 1566, 374, 841, 9380, 476, 320, 2326, 347, 17336, 789, 285, 436, 2929, 3400, 1774, 16039, 533, 352, 651, 17084, 2581, 685, 20171, 253, 2929, 281, 1056, 841, 10291, 2590, 50275, 783, 1543, 4518, 921, 253, 11361, 273, 253, 4081, 2746, 689, 2602, 285, 23507, 4116, 1666, 25379, 2299, 253, 3064, 275, 7387, 86, 4868, 875, 253, 11640, 273, 253, 2720, 390, 12637, 4116, 3210, 310, 1077, 1355, 2439, 512, 10234, 15302, 594, 281, 1056, 3916, 670, 534, 273, 253, 11640, 403, 1805, 387, 247, 5927, 7605, 8453, 5175, 943, 320, 2218, 1677, 326, 253, 2720, 16662, 1566, 17923, 3947, 25785, 310, 352, 43245, 625, 5919, 326, 253, 2120, 1566, 50275, 783, 2022, 5816, 3368, 310, 417, 2509, 4116, 12422, 387, 512, 253, 643, 3368, 326, 
310, 417, 2908, 347, 891, 7192, 352, 310, 281, 11897, 2720, 285, 12637, 4116, 533, 3997, 253, 2720, 4116, 2581, 685, 253, 12637, 4116, 50275, 783, 2929, 310, 6571, 3542, 1077, 4518, 627, 403, 816, 247, 1643, 963, 993, 285, 47412, 474, 6332, 275, 7118, 5976, 285, 7652, 50275, 1189, 455, 891, 1663, 751, 436, 2929, 285, 651, 751, 281, 923, 352, 7607, 3738, 891, 3524, 326, 247, 17265, 2715, 651, 1056, 253, 13260, 253, 1566, 310, 2403, 30909, 285, 1056, 10291, 281, 2905, 3210, 30909, 50274, 18, 11454, 8763, 1616, 729, 1566, 323, 5145, 10234, 259, 606, 1162, 355, 247, 498, 4765, 50276, 19, 1892, 1327, 2163, 302, 5120, 4116, 323, 1894, 5251, 28942, 259, 86, 439, 37557, 285, 13450, 442, 11436, 802, 13307, 81, 4765, 5474, 339, 1831, 10019, 414, 5368, 4116, 3210, 513, 417, 10126, 3890, 6355, 2190, 2709, 33056, 621, 253, 4477, 273, 436, 7714, 8460, 4187, 7239, 89, 285, 4853, 2720, 4116, 3268, 387, 7024, 327, 2045, 18012, 340, 85, 285, 12637, 4116, 3268, 387, 7024, 327, 1655, 3453, 340, 85, 347, 973, 285, 9093, 11897, 253, 2720, 4116, 387, 1655, 1899, 970, 12637, 4116, 387, 253, 2045, 1899, 253, 9079, 285, 3538, 569, 1056, 7605, 3282, 285, 247, 4564, 273, 13260, 6772, 3266, 569, 1646, 281, 320, 11134, 50275, 15177, 253, 4583, 3290, 273, 436, 2929, 310, 22335, 3590, 352, 7450, 84, 3579, 253, 2440, 273, 4116, 3210, 275, 3425, 281, 3425, 10603, 50276, 498, 15752, 253, 5697, 403, 3559, 973, 604, 253, 10668, 564, 949, 352, 7808, 390, 7019, 2299, 253, 4477, 878, 281, 19148, 253, 1563, 3374, 50276, 14346, 310, 417, 973, 2931, 50276, 249, 2593, 3307, 7239, 347, 247, 2159, 830, 273, 268, 610, 89, 18, 78, 812, 320, 20276, 285, 21643, 275, 7914, 273, 18925, 689, 534, 4903, 50275, 6377, 495, 1386, 655, 273, 2593, 27587, 943, 3802, 18, 320, 331, 18, 275, 1501, 83, 16662, 16186, 608, 285, 2571, 891, 2868, 387, 18, 310, 1805, 685, 247, 984, 253, 3438, 5224, 352, 310, 4116, 323, 1899, 246, 18, 50276, 74, 717, 247, 2372, 3663, 275, 253, 5740, 273, 8789, 14120, 253, 767, 23276, 323, 18326, 23539, 8789, 285, 45973, 414, 23539, 8789, 403, 417, 973, 5544, 50275, 249, 1635, 281, 253, 1840, 2201, 3374, 891, 671, 3636, 247, 1643, 36973, 50276, 32258, 1089, 50276, 32258, 4560, 1390, 1386, 273, 3239, 374, 943, 7239, 555, 85, 271, 271, 320, 7239, 555, 85, 387, 387, 1755, 76, 50276, 3956, 76, 247, 9696, 17375, 5019, 50276, 266, 9696, 17375, 5019, 690, 30404, 403, 417, 908, 6283, 824, 347, 1390, 495, 5784, 1386, 273, 3239, 577, 285, 26609, 403, 14454, 275, 690, 5053, 3966, 990, 273, 2593, 495, 1269, 943, 320, 275, 13433, 1664, 27370, 7413, 74, 1430, 50275, 79, 857, 7413, 74, 1430, 2120, 3523, 50276, 261, 5816, 275, 690, 5053, 26535, 543, 4116, 310, 417, 2931, 50276, 9188, 40348, 14023, 342, 271, 5368, 2602, 42959, 1566, 285, 271, 23507, 42959, 1566, 327, 2620, 5145, 10234, 15302, 921, 326, 253, 3045, 273, 970, 12637, 4116, 6296, 403, 1805, 685, 22791, 3210, 50275, 11183, 891, 452, 1239, 253, 4477, 2380, 619, 1655, 13716, 310, 2457, 5474, 339, 377, 2921, 337, 436, 789, 10262, 247, 4460, 5140, 273, 253, 4633, 314, 3197, 4116, 11911, 352, 2792, 562, 253, 3237, 27786, 275, 5368, 2216, 326, 4116, 11390, 403, 760, 10302, 1754, 327, 36833, 3470, 3185, 273, 7296, 253, 6355, 2190, 1016, 4116, 3213, 285, 3453, 4903, 281, 5115, 326, 253, 4477, 24813, 253, 6036, 3268, 347, 247, 1885, 273, 10649, 494, 2426, 387, 1016, 28921, 285, 4751, 22059, 253, 21011, 2190, 4116, 285, 3453, 4903, 2439, 253, 3425, 253, 16038, 310, 2590, 285, 253, 4081, 5700, 310, 3236, 285, 281, 253, 1127, 436, 2789, 253, 789, 4103, 4891, 285, 4722, 323, 247, 9311, 33810, 
253, 4477, 12661, 495, 1027, 15895, 323, 2720, 4116, 2403, 253, 789, 1014, 10046, 374, 253, 7681, 2600, 4453, 1175, 342, 1016, 7212, 3542, 4518, 285, 342, 4209, 31985, 422, 5018, 4677, 337, 3400, 2590, 23356, 327, 253, 5301, 342, 5899, 33056, 621, 285, 2722, 253, 5750, 273, 253, 4081, 1566, 495, 9470, 4679, 403, 5196, 1690, 608, 5145, 10234, 8892, 347, 973, 347, 1529, 21418, 2192, 1788, 4836, 841, 1543, 1056, 253, 3908, 625, 21414, 253, 4477, 671, 5196, 2007, 4679, 281, 12106, 253, 12510, 1690, 4116, 15579, 7103, 50276, 5040, 337, 253, 6793, 1491, 6221, 275, 253, 2929, 310, 417, 1077, 973, 34092, 352, 3936, 690, 673, 281, 19818, 1955, 281, 690, 12744, 390, 5816, 7234, 5742, 253, 13782, 323, 2720, 4116, 943, 320, 6960, 275, 247, 19087, 342, 247, 2593, 1416, 253, 495, 1027, 26850, 943, 320, 806, 17903, 285, 3053, 342, 253, 1072, 5161, 7212, 347, 577, 275, 436, 1039, 352, 588, 2489, 625, 2590, 273, 835, 1057, 16186, 23, 1705, 432, 390, 908, 323, 4390, 436, 629, 310, 21643, 374, 1142, 35225, 273, 4903, 1379, 1659, 1293, 7000, 8813, 24088, 340, 85, 342, 331, 247, 342, 1269, 66, 275, 1903, 3966, 812, 368, 5513, 1078, 2403, 841, 35225, 495, 347, 5393, 253, 30459, 2686, 48169, 1892, 33056, 621, 352, 943, 320, 1805, 281, 1056, 253, 3908, 625, 2590, 407, 11120, 15571, 16186, 883, 327, 849, 352, 347, 40275, 1892, 4116, 13782, 50276, 31569, 337, 275, 253, 5150, 1840, 495, 326, 48169, 2720, 255, 476, 368, 5513, 849, 869, 18, 1767, 4020, 684, 869, 1767, 47515, 253, 9376, 374, 849, 310, 16186, 22, 10302, 970, 806, 1340, 246, 9614, 7466, 849, 281, 1056, 1501, 83, 3304, 253, 5912, 285, 835, 1057, 1269, 66, 1705, 432, 495, 27090, 432, 7239, 327, 1755, 273, 3239, 495, 281, 16186, 883, 849, 513, 368, 16502, 340, 85, 387, 342, 331, 1269, 75, 310, 627, 247, 1745, 80, 323, 1269, 75, 577, 476, 368, 5513, 849, 310, 253, 8245, 2720, 16662, 8818, 5742, 849, 281, 11897, 2720, 970, 2602, 4116, 1293, 1501, 83, 187, 187, 4118, 18435, 27, 783, 30628, 273, 436, 2929, 5821, 326, 352, 556, 2218, 247, 13671, 2628, 273, 15250, 247, 4460, 285, 3505, 74, 6216, 2746, 281, 4116, 347, 247, 21624, 4778, 5277, 247, 747, 285, 3590, 873, 273, 17032, 5609, 281, 436, 990, 436, 21168, 327, 1755, 273, 247, 5955, 273, 253, 7364, 273, 5368, 30027, 7274, 281, 4116, 285, 13009, 253, 7680, 973, 275, 5886, 281, 643, 18902, 285, 19191, 7274, 281, 4116, 1223, 627, 403, 247, 1643, 3374, 342, 19843, 8704, 690, 7794, 273, 253, 4081, 1332, 534, 253, 4477, 403, 14659, 281, 1442, 292, 2517, 275, 616, 2457, 2715, 10054, 10182, 4116, 281, 253, 2278, 5701, 436, 2929, 310, 625, 390, 1679, 4704, 323, 9311, 342, 247, 1643, 13660, 8765, 352, 2789, 247, 2590, 1534, 285, 259, 4415, 1208, 6340, 7680, 281, 253, 1673, 273, 4116, 3210, 275, 3425, 281, 3425, 35615, 285, 588, 320, 273, 1270, 1600, 281, 1142, 36994, 387, 17857, 32888 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 247, 747, 3425, 281, 3425, 1566, 835, 4116, 310, 4127, 347, 247, 21624, 4778, 285, 15313, 4460, 17032, 7259, 323, 436, 1566, 253, 2746, 31326, 1534, 11701, 275, 5145, 10234, 285, 21418, 2192, 1788, 5978, 8892, 271, 11193, 310, 671, 908, 281, 1056, 1892, 4116, 625, 5919, 407, 8493, 253, 1180, 273, 2602, 4090, 265, 326, 452, 281, 320, 10302, 50274, 296, 3755, 20556, 50276, 2369, 652, 3505, 74, 6216, 3425, 281, 3425, 1566, 50276, 9072, 5661, 1543, 275, 5145, 10234, 285, 21418, 2192, 1788, 32213, 50276, 35002, 476, 320, 1160, 342, 2045, 8244, 2905, 35615, 50276, 44295, 28913, 4679, 812, 320, 2908, 50275, 783, 28529, 273, 253, 1566, 651, 320, 625, 2590, 604, 352, 310, 806, 6012, 1293, 4116, 12422, 253, 9376, 326, 3453, 310, 7976, 760, 327, 253, 1655, 4116, 4778, 310, 840, 3588, 253, 1616, 729, 9376, 327, 253, 4116, 4778, 943, 671, 320, 4767, 347, 271, 9376, 2581, 685, 271, 11193, 1677, 326, 9376, 347, 2080, 347, 891, 476, 2028, 253, 12637, 17032, 5199, 326, 310, 6012, 310, 3242, 352, 310, 6296, 6425, 281, 253, 970, 253, 3579, 13782, 273, 253, 10610, 3579, 2135, 1034, 5933, 323, 288, 78, 983, 281, 513, 17032, 50276, 783, 3210, 4583, 3268, 476, 840, 320, 2931, 275, 247, 8489, 1027, 1039, 685, 253, 4477, 9759, 534, 891, 1158, 2789, 625, 2590, 752, 253, 1566, 310, 2509, 7239, 50276, 89, 50276, 2204, 66, 354, 7064, 18, 79, 268, 1767, 50276, 1767, 1269, 387, 869, 50276, 1767, 1269, 387, 18, 50275, 783, 7424, 6012, 275, 253, 2929, 323, 12672, 253, 2720, 285, 12637, 4116, 310, 840, 816, 247, 7870, 2086, 323, 12672, 436, 3268, 285, 310, 6425, 281, 970, 253, 3579, 5933, 534, 275, 436, 3634, 310, 50276, 21697, 682, 50276, 4066, 50276, 66, 340, 85, 50276, 81, 1767, 50276, 296, 387, 247, 2020, 66, 355, 545, 255, 18, 66, 869, 50276, 66, 50276, 296, 387, 18, 50276, 66, 50275, 783, 760, 6832, 3064, 275, 253, 17032, 5199, 310, 840, 326, 253, 12637, 4116, 5912, 310, 10208, 715, 253, 29810, 391, 9866, 534, 2097, 326, 253, 14275, 13260, 403, 417, 13714, 3588, 667, 625, 1014, 2167, 253, 8350, 13260, 403, 1335, 16202, 949, 253, 1039, 17032, 310, 2218, 50276, 18, 4102, 4081, 247, 1566, 342, 247, 2074, 39401, 3738, 326, 1566, 858, 417, 3997, 253, 4116, 3268, 285, 2684, 802, 3022, 17032, 342, 253, 3579, 2135, 1034, 5933, 1223, 436, 1566, 310, 8069, 12672, 3579, 20552, 285, 9591, 17032, 949, 12077, 9827, 50276, 783, 2720, 16662, 12955, 2167, 697, 5426, 310, 417, 347, 2590, 347, 352, 943, 320, 3133, 281, 320, 7384, 326, 253, 4116, 3268, 387, 1016, 673, 3213, 310, 3907, 273, 253, 2045, 4116, 2074, 281, 253, 1039, 2629, 2602, 4116, 310, 10302, 50276, 783, 7424, 840, 4796, 281, 247, 11454, 2715, 273, 18890, 78, 12420, 1566, 337, 2074, 281, 1529, 4102, 4081, 1566, 374, 841, 9380, 476, 320, 2326, 347, 17336, 789, 285, 436, 2929, 3400, 1774, 16039, 533, 352, 651, 17084, 2581, 685, 20171, 253, 2929, 281, 1056, 841, 10291, 2590, 50275, 783, 1543, 4518, 921, 253, 11361, 273, 253, 4081, 2746, 689, 2602, 285, 23507, 4116, 1666, 25379, 2299, 253, 3064, 275, 7387, 86, 4868, 875, 253, 11640, 273, 253, 2720, 390, 12637, 4116, 3210, 310, 1077, 1355, 2439, 512, 10234, 15302, 594, 281, 1056, 3916, 670, 534, 273, 253, 11640, 403, 1805, 387, 247, 5927, 7605, 8453, 5175, 943, 320, 2218, 1677, 326, 253, 2720, 16662, 1566, 17923, 3947, 25785, 310, 352, 43245, 625, 5919, 326, 253, 2120, 1566, 50275, 783, 2022, 5816, 3368, 310, 417, 2509, 4116, 12422, 387, 512, 253, 643, 3368, 326, 
310, 417, 2908, 347, 891, 7192, 352, 310, 281, 11897, 2720, 285, 12637, 4116, 533, 3997, 253, 2720, 4116, 2581, 685, 253, 12637, 4116, 50275, 783, 2929, 310, 6571, 3542, 1077, 4518, 627, 403, 816, 247, 1643, 963, 993, 285, 47412, 474, 6332, 275, 7118, 5976, 285, 7652, 50275, 1189, 455, 891, 1663, 751, 436, 2929, 285, 651, 751, 281, 923, 352, 7607, 3738, 891, 3524, 326, 247, 17265, 2715, 651, 1056, 253, 13260, 253, 1566, 310, 2403, 30909, 285, 1056, 10291, 281, 2905, 3210, 30909, 50274, 18, 11454, 8763, 1616, 729, 1566, 323, 5145, 10234, 259, 606, 1162, 355, 247, 498, 4765, 50276, 19, 1892, 1327, 2163, 302, 5120, 4116, 323, 1894, 5251, 28942, 259, 86, 439, 37557, 285, 13450, 442, 11436, 802, 13307, 81, 4765, 5474, 339, 1831, 10019, 414, 5368, 4116, 3210, 513, 417, 10126, 3890, 6355, 2190, 2709, 33056, 621, 253, 4477, 273, 436, 7714, 8460, 4187, 7239, 89, 285, 4853, 2720, 4116, 3268, 387, 7024, 327, 2045, 18012, 340, 85, 285, 12637, 4116, 3268, 387, 7024, 327, 1655, 3453, 340, 85, 347, 973, 285, 9093, 11897, 253, 2720, 4116, 387, 1655, 1899, 970, 12637, 4116, 387, 253, 2045, 1899, 253, 9079, 285, 3538, 569, 1056, 7605, 3282, 285, 247, 4564, 273, 13260, 6772, 3266, 569, 1646, 281, 320, 11134, 50275, 15177, 253, 4583, 3290, 273, 436, 2929, 310, 22335, 3590, 352, 7450, 84, 3579, 253, 2440, 273, 4116, 3210, 275, 3425, 281, 3425, 10603, 50276, 498, 15752, 253, 5697, 403, 3559, 973, 604, 253, 10668, 564, 949, 352, 7808, 390, 7019, 2299, 253, 4477, 878, 281, 19148, 253, 1563, 3374, 50276, 14346, 310, 417, 973, 2931, 50276, 249, 2593, 3307, 7239, 347, 247, 2159, 830, 273, 268, 610, 89, 18, 78, 812, 320, 20276, 285, 21643, 275, 7914, 273, 18925, 689, 534, 4903, 50275, 6377, 495, 1386, 655, 273, 2593, 27587, 943, 3802, 18, 320, 331, 18, 275, 1501, 83, 16662, 16186, 608, 285, 2571, 891, 2868, 387, 18, 310, 1805, 685, 247, 984, 253, 3438, 5224, 352, 310, 4116, 323, 1899, 246, 18, 50276, 74, 717, 247, 2372, 3663, 275, 253, 5740, 273, 8789, 14120, 253, 767, 23276, 323, 18326, 23539, 8789, 285, 45973, 414, 23539, 8789, 403, 417, 973, 5544, 50275, 249, 1635, 281, 253, 1840, 2201, 3374, 891, 671, 3636, 247, 1643, 36973, 50276, 32258, 1089, 50276, 32258, 4560, 1390, 1386, 273, 3239, 374, 943, 7239, 555, 85, 271, 271, 320, 7239, 555, 85, 387, 387, 1755, 76, 50276, 3956, 76, 247, 9696, 17375, 5019, 50276, 266, 9696, 17375, 5019, 690, 30404, 403, 417, 908, 6283, 824, 347, 1390, 495, 5784, 1386, 273, 3239, 577, 285, 26609, 403, 14454, 275, 690, 5053, 3966, 990, 273, 2593, 495, 1269, 943, 320, 275, 13433, 1664, 27370, 7413, 74, 1430, 50275, 79, 857, 7413, 74, 1430, 2120, 3523, 50276, 261, 5816, 275, 690, 5053, 26535, 543, 4116, 310, 417, 2931, 50276, 9188, 40348, 14023, 342, 271, 5368, 2602, 42959, 1566, 285, 271, 23507, 42959, 1566, 327, 2620, 5145, 10234, 15302, 921, 326, 253, 3045, 273, 970, 12637, 4116, 6296, 403, 1805, 685, 22791, 3210, 50275, 11183, 891, 452, 1239, 253, 4477, 2380, 619, 1655, 13716, 310, 2457, 5474, 339, 377, 2921, 337, 436, 789, 10262, 247, 4460, 5140, 273, 253, 4633, 314, 3197, 4116, 11911, 352, 2792, 562, 253, 3237, 27786, 275, 5368, 2216, 326, 4116, 11390, 403, 760, 10302, 1754, 327, 36833, 3470, 3185, 273, 7296, 253, 6355, 2190, 1016, 4116, 3213, 285, 3453, 4903, 281, 5115, 326, 253, 4477, 24813, 253, 6036, 3268, 347, 247, 1885, 273, 10649, 494, 2426, 387, 1016, 28921, 285, 4751, 22059, 253, 21011, 2190, 4116, 285, 3453, 4903, 2439, 253, 3425, 253, 16038, 310, 2590, 285, 253, 4081, 5700, 310, 3236, 285, 281, 253, 1127, 436, 2789, 253, 789, 4103, 4891, 285, 4722, 323, 247, 9311, 33810, 
253, 4477, 12661, 495, 1027, 15895, 323, 2720, 4116, 2403, 253, 789, 1014, 10046, 374, 253, 7681, 2600, 4453, 1175, 342, 1016, 7212, 3542, 4518, 285, 342, 4209, 31985, 422, 5018, 4677, 337, 3400, 2590, 23356, 327, 253, 5301, 342, 5899, 33056, 621, 285, 2722, 253, 5750, 273, 253, 4081, 1566, 495, 9470, 4679, 403, 5196, 1690, 608, 5145, 10234, 8892, 347, 973, 347, 1529, 21418, 2192, 1788, 4836, 841, 1543, 1056, 253, 3908, 625, 21414, 253, 4477, 671, 5196, 2007, 4679, 281, 12106, 253, 12510, 1690, 4116, 15579, 7103, 50276, 5040, 337, 253, 6793, 1491, 6221, 275, 253, 2929, 310, 417, 1077, 973, 34092, 352, 3936, 690, 673, 281, 19818, 1955, 281, 690, 12744, 390, 5816, 7234, 5742, 253, 13782, 323, 2720, 4116, 943, 320, 6960, 275, 247, 19087, 342, 247, 2593, 1416, 253, 495, 1027, 26850, 943, 320, 806, 17903, 285, 3053, 342, 253, 1072, 5161, 7212, 347, 577, 275, 436, 1039, 352, 588, 2489, 625, 2590, 273, 835, 1057, 16186, 23, 1705, 432, 390, 908, 323, 4390, 436, 629, 310, 21643, 374, 1142, 35225, 273, 4903, 1379, 1659, 1293, 7000, 8813, 24088, 340, 85, 342, 331, 247, 342, 1269, 66, 275, 1903, 3966, 812, 368, 5513, 1078, 2403, 841, 35225, 495, 347, 5393, 253, 30459, 2686, 48169, 1892, 33056, 621, 352, 943, 320, 1805, 281, 1056, 253, 3908, 625, 2590, 407, 11120, 15571, 16186, 883, 327, 849, 352, 347, 40275, 1892, 4116, 13782, 50276, 31569, 337, 275, 253, 5150, 1840, 495, 326, 48169, 2720, 255, 476, 368, 5513, 849, 869, 18, 1767, 4020, 684, 869, 1767, 47515, 253, 9376, 374, 849, 310, 16186, 22, 10302, 970, 806, 1340, 246, 9614, 7466, 849, 281, 1056, 1501, 83, 3304, 253, 5912, 285, 835, 1057, 1269, 66, 1705, 432, 495, 27090, 432, 7239, 327, 1755, 273, 3239, 495, 281, 16186, 883, 849, 513, 368, 16502, 340, 85, 387, 342, 331, 1269, 75, 310, 627, 247, 1745, 80, 323, 1269, 75, 577, 476, 368, 5513, 849, 310, 253, 8245, 2720, 16662, 8818, 5742, 849, 281, 11897, 2720, 970, 2602, 4116, 1293, 1501, 83, 187, 187, 4118, 18435, 27, 783, 30628, 273, 436, 2929, 5821, 326, 352, 556, 2218, 247, 13671, 2628, 273, 15250, 247, 4460, 285, 3505, 74, 6216, 2746, 281, 4116, 347, 247, 21624, 4778, 5277, 247, 747, 285, 3590, 873, 273, 17032, 5609, 281, 436, 990, 436, 21168, 327, 1755, 273, 247, 5955, 273, 253, 7364, 273, 5368, 30027, 7274, 281, 4116, 285, 13009, 253, 7680, 973, 275, 5886, 281, 643, 18902, 285, 19191, 7274, 281, 4116, 1223, 627, 403, 247, 1643, 3374, 342, 19843, 8704, 690, 7794, 273, 253, 4081, 1332, 534, 253, 4477, 403, 14659, 281, 1442, 292, 2517, 275, 616, 2457, 2715, 10054, 10182, 4116, 281, 253, 2278, 5701, 436, 2929, 310, 625, 390, 1679, 4704, 323, 9311, 342, 247, 1643, 13660, 8765, 352, 2789, 247, 2590, 1534, 285, 259, 4415, 1208, 6340, 7680, 281, 253, 1673, 273, 4116, 3210, 275, 3425, 281, 3425, 35615, 285, 588, 320, 273, 1270, 1600, 281, 1142, 36994, 387, 17857, 32888 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: clear exposition of the derivations motivations related work and experiments very interesting idea a large number and range of competitor models the two tested datasets are very similar it would have been more informative to test on two datasets that are more different there is a discussion of feature learning in fewshot learning but i think that there may need to be more of a distinction feature learning representation learning can often be done during metatraining before fewshot learning occurs seems to be some typos eg without proposed framework in the conclusion docsepthe proposed method is effective for fewshot learning and outperforms some stateoftheart competitors and it can handle overfitting oversmoothing and noise it is better to say bncv than mutual information since it actually incorporates the bncv method my main concern is how to actually use the idea of mutual information this work is just using bncv i feel the title is kind of misleading or a little bit big therefore i wonder if you are thinking of ideas of using mi other than bncv even just simply pairwise or something i somewhat doubt the effectiveness of upgrading to bncv i am not targeting mi i wish the authors had done more to justify this this impacts the significance of this work docsepthe paper presents an approach that is novel for handling the problem of fewshot node classification on graphs the main idea of adopting the bayesian network with continuous variables to improve the correlation representation among the node features is a novel idea at least in this context the paper shows that the idea is effective yielding stateoftheart results on fewshot node classification benchmarks cora and citeseer 1 according to the experiment section of the paper the task concerned in this paper is fewshot node classification on graphs but it is described as fewshot learning in the whole paper which could be misleading 2 the results are not so convincing first it seems that the method proposed in this paper is not completely superior to previous methods 1way 3shot in citeseer second the paper lacks results on some other datasets such as amazon email and reddit which are mentioned in the previous methods like rale 3 the authors do not compare the noise stability of the proposed method with other methods although the proposed method is very stable to noise we do not know how stable it is compared with other methods first the authors should reorganize the description of the target task second the authors could prove the effectiveness of their strategies by comparing their work with other works on more datasets finally the authors could compare the noise stability of the proposed method with other methods to certify the noise stability of the proposed method docsep1 there are some research findings that attempt to incorporate mutual information into deep neural networks but this approach is a novel solution to the fewshot learning task 2 the idea of using mi to represent the correlation among features is wellmotivated and reasonable 3 building bncv from the dropout in hidden layers of bgs is interesting and approximating the mi via probabilistic inferences over bncv is technically correct 4 comprehensive experiments have been conducted and the results show the effectiveness of the approach 1 the experimental results would be more convincing if more datasets could be adopted 2 some concepts in section 3 are not properly
explained 1 the proposed bgsmi method significantly improves the accuracy of fewshot learning compared with bgs in table 2 and table 3 but is worse than rale in some situations which should be discussed in more detail 2 what is the definition of graph g in section 3 how to obtain g from the datasets cora and citeseer section 51 these should be explained 3 is there any difference or relationship between d and i in section 3 4 mi is usually used to describe the dependence among data is the term dependence more appropriate than correlation the motivation of correlation should be further discussed properly ### Summary:
meta review the paper proposes a new fewshot learning method using mutual information to model relationships among features quality the paper is technically sound and has a convincing experimental evaluation clarity the paper is well written originality the idea is very interesting and has been demonstrated to be very useful in fewshot learning significance the method has great potential in fewshot learning three reviewers are very positive about the paper the idea is excellent and the experimental results are convincing one reviewer raises some concerns the authors have responded to the concerns and have provided quite reasonable explanations for the reviewers concerns
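For readers less familiar with the evaluation protocol the reviewers refer to (e.g. the 1-way 3-shot setting on cora and citeseer), the sketch below shows how an n-way k-shot episode is commonly sampled for few-shot node classification. It illustrates only the task setup discussed in the reviews, not the paper's bncv construction, and every function and variable name in it is an assumption made for illustration.

```python
# Minimal sketch, assuming labels are available as a plain dict of node id -> class;
# not taken from the reviewed paper.
import random
from collections import defaultdict

def sample_episode(node_labels, n_way=1, k_shot=3, n_query=5):
    """Sample support/query node ids for one n-way k-shot episode."""
    by_class = defaultdict(list)
    for node, label in node_labels.items():
        by_class[label].append(node)

    # only classes with enough labelled nodes can be used in this episode
    eligible = [c for c, nodes in by_class.items() if len(nodes) >= k_shot + n_query]
    classes = random.sample(eligible, n_way)

    support, query = [], []
    for episode_class, c in enumerate(classes):
        nodes = random.sample(by_class[c], k_shot + n_query)
        support += [(n, episode_class) for n in nodes[:k_shot]]
        query += [(n, episode_class) for n in nodes[k_shot:]]
    return support, query

# the "1-way 3-shot" setting mentioned by the reviewers, on a dummy stand-in
# for a 7-class Cora-sized labelling
labels = {i: i % 7 for i in range(2708)}
support, query = sample_episode(labels, n_way=1, k_shot=3)
```

The reviewers' point about noise stability could then be probed by corrupting a fraction of the support labels before each episode and tracking how accuracy degrades.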
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2590, 47284, 273, 253, 3538, 569, 42852, 2905, 789, 285, 4679, 50276, 635, 4722, 2934, 50276, 66, 1781, 1180, 285, 2491, 273, 32048, 3210, 50275, 783, 767, 5762, 15302, 403, 1077, 2074, 352, 651, 452, 644, 625, 27096, 281, 1071, 327, 767, 15302, 326, 403, 625, 1027, 50275, 9088, 310, 247, 5955, 273, 4735, 4715, 275, 1643, 11860, 4715, 533, 891, 1158, 326, 627, 778, 878, 281, 320, 625, 273, 247, 13812, 4735, 4715, 50276, 37626, 4715, 476, 2223, 320, 2218, 1309, 1313, 255, 26208, 1078, 1643, 11860, 4715, 6634, 50276, 339, 3030, 281, 320, 690, 963, 993, 24088, 1293, 4081, 7792, 275, 253, 6452, 50276, 7152, 339, 431, 248, 4081, 1332, 310, 3576, 323, 1643, 11860, 4715, 285, 41731, 13015, 690, 1375, 23037, 14387, 21607, 285, 352, 476, 6016, 689, 31893, 689, 34006, 285, 6046, 352, 310, 1805, 281, 1333, 270, 9068, 87, 685, 15577, 1491, 1580, 352, 2686, 31167, 253, 270, 9068, 87, 1332, 619, 2022, 4468, 310, 849, 281, 2686, 897, 253, 2934, 273, 15577, 1491, 50276, 2520, 789, 310, 816, 970, 270, 9068, 87, 891, 1928, 253, 4060, 310, 2238, 273, 24363, 390, 247, 1652, 2372, 1943, 3103, 891, 4282, 604, 368, 403, 4680, 273, 5697, 273, 970, 3641, 643, 685, 270, 9068, 87, 1014, 816, 3365, 28208, 390, 1633, 891, 2372, 5545, 253, 12510, 273, 38234, 281, 270, 9068, 87, 891, 717, 417, 12262, 3641, 891, 5730, 253, 2488, 878, 625, 281, 15249, 436, 436, 16274, 253, 8453, 273, 436, 789, 5474, 339, 431, 248, 2929, 10262, 271, 2746, 326, 310, 4460, 323, 10885, 253, 1895, 273, 323, 1643, 11860, 4666, 9162, 327, 4216, 253, 2022, 2934, 273, 25987, 253, 17699, 16561, 2990, 342, 5415, 4903, 281, 3157, 253, 5921, 6779, 2190, 253, 4666, 3386, 310, 247, 4460, 2934, 387, 1878, 275, 436, 3634, 50276, 783, 2929, 2722, 326, 253, 2934, 310, 3576, 27012, 1375, 23037, 14387, 1543, 327, 1643, 11860, 4666, 9162, 49602, 944, 66, 285, 4851, 3248, 254, 50276, 18, 2556, 281, 253, 3368, 2593, 273, 253, 2929, 253, 4836, 7514, 275, 436, 2929, 310, 253, 1643, 11860, 4666, 9162, 327, 4216, 533, 352, 310, 2529, 347, 253, 1643, 11860, 4715, 275, 253, 2644, 2929, 534, 812, 320, 24363, 374, 253, 1543, 403, 417, 594, 21414, 806, 352, 3133, 326, 253, 1332, 4081, 275, 436, 2929, 310, 417, 4336, 8936, 281, 2045, 3082, 337, 1106, 495, 11860, 275, 4851, 3248, 254, 1273, 253, 2929, 812, 3480, 1543, 327, 690, 643, 15302, 824, 347, 7001, 251, 4579, 285, 28159, 262, 534, 403, 5393, 275, 253, 2045, 3082, 751, 391, 1079, 495, 253, 4477, 513, 417, 7277, 253, 6046, 7882, 273, 253, 4081, 1332, 342, 643, 3082, 3738, 253, 4081, 1332, 310, 1077, 6474, 281, 6046, 359, 513, 417, 871, 849, 6474, 352, 310, 2429, 342, 643, 3082, 50276, 7053, 253, 4477, 943, 294, 7397, 907, 253, 5740, 273, 253, 2303, 4836, 1273, 4477, 812, 5276, 253, 12510, 273, 616, 8130, 407, 10941, 616, 789, 342, 643, 2987, 327, 625, 15302, 4720, 253, 4477, 812, 7277, 253, 6046, 7882, 273, 253, 4081, 1332, 342, 643, 3082, 281, 14204, 253, 6046, 7882, 273, 253, 4081, 1332, 5474, 33032, 18, 186, 9088, 403, 690, 2561, 4342, 326, 3177, 281, 2486, 253, 15577, 1491, 715, 3676, 11454, 6928, 533, 436, 2746, 310, 247, 4460, 2900, 281, 1643, 11860, 4715, 4836, 374, 186, 783, 2934, 273, 970, 3641, 281, 1957, 253, 5921, 2190, 3386, 310, 973, 24013, 8550, 285, 5272, 495, 186, 22157, 270, 9068, 87, 432, 253, 5926, 483, 275, 8763, 8090, 273, 270, 5943, 310, 4722, 285, 4020, 839, 253, 3641, 3066, 37851, 27377, 689, 270, 9068, 87, 310, 22335, 3451, 577, 186, 3118, 8391, 422, 
4679, 452, 644, 5196, 285, 253, 1543, 921, 253, 12510, 273, 253, 2746, 50275, 18, 186, 783, 5661, 1543, 651, 320, 625, 21414, 604, 625, 15302, 812, 320, 8671, 374, 186, 8826, 12342, 275, 2593, 495, 403, 417, 6283, 5544, 50275, 18, 186, 783, 4081, 15826, 3610, 74, 1332, 3012, 19132, 253, 7200, 273, 1643, 11860, 4715, 2429, 342, 270, 5943, 275, 2829, 374, 285, 2829, 495, 533, 403, 7197, 685, 391, 1079, 275, 690, 9534, 534, 943, 320, 5469, 275, 625, 4278, 50276, 19, 186, 5371, 310, 5426, 273, 4216, 305, 2593, 495, 849, 281, 4044, 305, 432, 253, 15302, 944, 66, 285, 4851, 3248, 254, 2593, 8319, 841, 943, 320, 12814, 495, 186, 261, 627, 667, 3064, 390, 2954, 875, 277, 285, 891, 275, 2593, 495, 577, 186, 7373, 310, 3798, 908, 281, 6266, 253, 10096, 2190, 941, 310, 253, 1307, 10096, 625, 6283, 685, 5921, 253, 16038, 273, 5921, 943, 320, 2007, 5469, 6283, 50276, 187, 187, 4118, 18435, 27, 13518, 2278, 253, 2929, 29328, 247, 747, 1643, 11860, 4715, 1332, 970, 15577, 1491, 281, 1566, 7688, 2190, 3386, 50275, 15177, 253, 2929, 310, 22335, 3590, 285, 556, 247, 21414, 5661, 7103, 50275, 498, 15752, 253, 2929, 310, 973, 3542, 50276, 19164, 414, 253, 2934, 310, 1077, 4722, 285, 556, 644, 5183, 1077, 4217, 275, 1643, 11860, 4715, 50275, 9188, 40348, 253, 1332, 556, 1270, 2442, 275, 1643, 11860, 4715, 50275, 13524, 30628, 403, 1077, 2762, 670, 253, 2929, 253, 2934, 310, 7126, 285, 253, 5661, 1543, 403, 21414, 50275, 531, 37317, 16540, 690, 7350, 253, 4477, 452, 10974, 281, 253, 7350, 253, 4477, 452, 2530, 3240, 5272, 22909, 323, 253, 30628, 7350, 50276 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2590, 47284, 273, 253, 3538, 569, 42852, 2905, 789, 285, 4679, 50276, 635, 4722, 2934, 50276, 66, 1781, 1180, 285, 2491, 273, 32048, 3210, 50275, 783, 767, 5762, 15302, 403, 1077, 2074, 352, 651, 452, 644, 625, 27096, 281, 1071, 327, 767, 15302, 326, 403, 625, 1027, 50275, 9088, 310, 247, 5955, 273, 4735, 4715, 275, 1643, 11860, 4715, 533, 891, 1158, 326, 627, 778, 878, 281, 320, 625, 273, 247, 13812, 4735, 4715, 50276, 37626, 4715, 476, 2223, 320, 2218, 1309, 1313, 255, 26208, 1078, 1643, 11860, 4715, 6634, 50276, 339, 3030, 281, 320, 690, 963, 993, 24088, 1293, 4081, 7792, 275, 253, 6452, 50276, 7152, 339, 431, 248, 4081, 1332, 310, 3576, 323, 1643, 11860, 4715, 285, 41731, 13015, 690, 1375, 23037, 14387, 21607, 285, 352, 476, 6016, 689, 31893, 689, 34006, 285, 6046, 352, 310, 1805, 281, 1333, 270, 9068, 87, 685, 15577, 1491, 1580, 352, 2686, 31167, 253, 270, 9068, 87, 1332, 619, 2022, 4468, 310, 849, 281, 2686, 897, 253, 2934, 273, 15577, 1491, 50276, 2520, 789, 310, 816, 970, 270, 9068, 87, 891, 1928, 253, 4060, 310, 2238, 273, 24363, 390, 247, 1652, 2372, 1943, 3103, 891, 4282, 604, 368, 403, 4680, 273, 5697, 273, 970, 3641, 643, 685, 270, 9068, 87, 1014, 816, 3365, 28208, 390, 1633, 891, 2372, 5545, 253, 12510, 273, 38234, 281, 270, 9068, 87, 891, 717, 417, 12262, 3641, 891, 5730, 253, 2488, 878, 625, 281, 15249, 436, 436, 16274, 253, 8453, 273, 436, 789, 5474, 339, 431, 248, 2929, 10262, 271, 2746, 326, 310, 4460, 323, 10885, 253, 1895, 273, 323, 1643, 11860, 4666, 9162, 327, 4216, 253, 2022, 2934, 273, 25987, 253, 17699, 16561, 2990, 342, 5415, 4903, 281, 3157, 253, 5921, 6779, 2190, 253, 4666, 3386, 310, 247, 4460, 2934, 387, 1878, 275, 436, 3634, 50276, 783, 2929, 2722, 326, 253, 2934, 310, 3576, 27012, 1375, 23037, 14387, 1543, 327, 1643, 11860, 4666, 9162, 49602, 944, 66, 285, 4851, 3248, 254, 50276, 18, 2556, 281, 253, 3368, 2593, 273, 253, 2929, 253, 4836, 7514, 275, 436, 2929, 310, 253, 1643, 11860, 4666, 9162, 327, 4216, 533, 352, 310, 2529, 347, 253, 1643, 11860, 4715, 275, 253, 2644, 2929, 534, 812, 320, 24363, 374, 253, 1543, 403, 417, 594, 21414, 806, 352, 3133, 326, 253, 1332, 4081, 275, 436, 2929, 310, 417, 4336, 8936, 281, 2045, 3082, 337, 1106, 495, 11860, 275, 4851, 3248, 254, 1273, 253, 2929, 812, 3480, 1543, 327, 690, 643, 15302, 824, 347, 7001, 251, 4579, 285, 28159, 262, 534, 403, 5393, 275, 253, 2045, 3082, 751, 391, 1079, 495, 253, 4477, 513, 417, 7277, 253, 6046, 7882, 273, 253, 4081, 1332, 342, 643, 3082, 3738, 253, 4081, 1332, 310, 1077, 6474, 281, 6046, 359, 513, 417, 871, 849, 6474, 352, 310, 2429, 342, 643, 3082, 50276, 7053, 253, 4477, 943, 294, 7397, 907, 253, 5740, 273, 253, 2303, 4836, 1273, 4477, 812, 5276, 253, 12510, 273, 616, 8130, 407, 10941, 616, 789, 342, 643, 2987, 327, 625, 15302, 4720, 253, 4477, 812, 7277, 253, 6046, 7882, 273, 253, 4081, 1332, 342, 643, 3082, 281, 14204, 253, 6046, 7882, 273, 253, 4081, 1332, 5474, 33032, 18, 186, 9088, 403, 690, 2561, 4342, 326, 3177, 281, 2486, 253, 15577, 1491, 715, 3676, 11454, 6928, 533, 436, 2746, 310, 247, 4460, 2900, 281, 1643, 11860, 4715, 4836, 374, 186, 783, 2934, 273, 970, 3641, 281, 1957, 253, 5921, 2190, 3386, 310, 973, 24013, 8550, 285, 5272, 495, 186, 22157, 270, 9068, 87, 432, 253, 5926, 483, 275, 8763, 8090, 273, 270, 5943, 310, 4722, 285, 4020, 839, 253, 3641, 3066, 37851, 27377, 689, 270, 9068, 87, 310, 22335, 3451, 577, 186, 3118, 8391, 422, 
4679, 452, 644, 5196, 285, 253, 1543, 921, 253, 12510, 273, 253, 2746, 50275, 18, 186, 783, 5661, 1543, 651, 320, 625, 21414, 604, 625, 15302, 812, 320, 8671, 374, 186, 8826, 12342, 275, 2593, 495, 403, 417, 6283, 5544, 50275, 18, 186, 783, 4081, 15826, 3610, 74, 1332, 3012, 19132, 253, 7200, 273, 1643, 11860, 4715, 2429, 342, 270, 5943, 275, 2829, 374, 285, 2829, 495, 533, 403, 7197, 685, 391, 1079, 275, 690, 9534, 534, 943, 320, 5469, 275, 625, 4278, 50276, 19, 186, 5371, 310, 5426, 273, 4216, 305, 2593, 495, 849, 281, 4044, 305, 432, 253, 15302, 944, 66, 285, 4851, 3248, 254, 2593, 8319, 841, 943, 320, 12814, 495, 186, 261, 627, 667, 3064, 390, 2954, 875, 277, 285, 891, 275, 2593, 495, 577, 186, 7373, 310, 3798, 908, 281, 6266, 253, 10096, 2190, 941, 310, 253, 1307, 10096, 625, 6283, 685, 5921, 253, 16038, 273, 5921, 943, 320, 2007, 5469, 6283, 50276, 187, 187, 4118, 18435, 27, 13518, 2278, 253, 2929, 29328, 247, 747, 1643, 11860, 4715, 1332, 970, 15577, 1491, 281, 1566, 7688, 2190, 3386, 50275, 15177, 253, 2929, 310, 22335, 3590, 285, 556, 247, 21414, 5661, 7103, 50275, 498, 15752, 253, 2929, 310, 973, 3542, 50276, 19164, 414, 253, 2934, 310, 1077, 4722, 285, 556, 644, 5183, 1077, 4217, 275, 1643, 11860, 4715, 50275, 9188, 40348, 253, 1332, 556, 1270, 2442, 275, 1643, 11860, 4715, 50275, 13524, 30628, 403, 1077, 2762, 670, 253, 2929, 253, 2934, 310, 7126, 285, 253, 5661, 1543, 403, 21414, 50275, 531, 37317, 16540, 690, 7350, 253, 4477, 452, 10974, 281, 253, 7350, 253, 4477, 452, 2530, 3240, 5272, 22909, 323, 253, 30628, 7350, 50276 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: 1 summary this paper proposes an improved offline rl batch rl algorithm combining the stateoftheart behaviorregularization actorcritic method nair et al 2020 with a modelbased rl technique n trained probabilistic dynamics models generate fictitious trajectories with uncertaintypenalized rewards after pretraining the policy with the behaviorregularization solely on the offline data both these generated data and the original offline data are used for the further behaviorregularized actorcritic training numerical results showed that the proposed method outperformed recent offline modelfree and modelbased rl algorithms 2 pros this paper efficiently combines the stateoftheart method with a modelbased rl technique challenges in offline rl and the related works are welldescribed the ablation study investigated five different kinds of data types for indepth investigation 3 cons 1 experimental support the ablation study is unfair the results of ours in table 1 are the maximum combination of awac and awacmb2po which includes five best results from awac out of nine best results from ours although related explanations are written at the end of section 5 awacmb2po should be considered as ours this is the reason why i had no choice but to give a harsh rating besides modelbased learning and generating fictitious samples suffer from compounding errors so the learning can be inaccurate especially when the length of the trajectories is long however the proposed algorithm uses 95 of fictitious samples in the modelbased finetuning phase causing degradation of overall performance due to the compounding error issues 2 novelty this work is not novel since this papers algorithm heavily depends on the stateoftheart awac nair et al 2020 and the uncertaintypenalized reward generalization is from other previous work yu et al 2020 3 reproducibility the papers experiments do not guarantee reproducibility since the authors did not provide the implementation code 4 minor concerns the notion of the behavioral policy should be unified either pib or pibeta the notion of the expectation should be unified either e or mathbbe post rebuttal responses i thank the authors for their replies and the corresponding new revision after reading them and the other reviews carefully i changed the rating accordingly however i still lean toward rejecting this paper for the following reasons 1 when saying our method outperforms something the standard deviation and the mean should be considered such as a confidence interval or statistical test however the authors seem only to consider the mean in the paper and the other replies in this sense the proposed method does not outperform cql besides the variances of cql in halfcheetahmixed and hoppermixed are required to conclude that the proposed method outperforms cql in the mixed setting eg the variance of the proposed method in hoppermixed is high 2 i agree with the other reviewers common concerns on the novelty main benefit of this paper 3 the supplementary code can be shared with the reviewers using an anonymous link as explained in the iclr author guide it would have been better if the authors used this functionality of official comments during the review process for better reproducibility docsephi first i wanted to thank the authors for putting this manuscript together i enjoyed reading it summary the authors suggest modelbased behaviorregularized policy optimization mb2po for fine tuning policies trained with
behavioral regularized model free rl i enjoyed reading the paper i think its a neat idea and the authors did a good job explaining it away from the details looking at the results i have an important concern about the usefulness of the method the authors claim fine tuning helps if awac has not achieved expert performance but looking at the results the method has not delivered on that promise and in many cases the performance actually degraded so if we cannot be at least to a good extent confident that fine tuning will make the performance better what is the main benefit especially if we are in a mostly offpolicy setting and cannot really get the performance of the model by running the policy multiple times on the environment i am happy to increase my score if the authors clarify my concern misunderstanding thanks docsep summary the paper describes a novel method to combine two different schools of thought for improving offline reinforcement learning the model free part relies on keeping policy changes local and the model based part uses model uncertainties to fine tune to obtain the final offline rl policy strong points 1 the paper is well written this is impressive as both the methods on which the paper relies are recent additions to the literature and as such i was not fully aware of the techniques described therein however despite this the paper was easy to follow and a pleasant read 2 i would also like to thank the authors for honestly describing the scenarios in which fine tuning led to degradation in the performance of the proposed method what can be improved 1 the paper relies on the awac method which is itself novel and not peer reviewed with benefit of doubt to the authors i assume that awac represents excellent scholarship this though raises another question as to how much contribution the results of this paper have on their own there may be excellent points the authors use to improve upon awac results however with the time budget allocated to this review it is impossible to read and understand awac and then objectively judge the improvements brought about by this scholarship 2 again i would like to commend the authors for the future work directions mentioned in the conclusion unfortunately these are the very same questions i would have hoped this paper addresses for example one of the classic use cases for offline rl would be that we are not aware of the situation whether the data is from an expert policy or a naive policy so it is very challenging to decide whether or not to fine tune the results one obtains from awac for example 3 on page 6 penultimate paragraph the authors say and we are confident that we can learn an effective model with the available data then we do additional finetuning using mb2po i am not sure how we can establish this this is a genuine question and not a rhetorical remark 4 the details of the models used for fine tuning are absent i did not see which nn models are used to build the dynamics models i am sure the authors would agree that a neural network that outputs a gaussian distribution would cover a large literature at a conference like iclr moreover what network is used would significantly impact what can be learned by effective referring to the comment above typos no impact on score is there a tilde missing in the algorithm 43 just above add sample if not do we need to learn a reward model on page 6 last paragraph the iterations mentioned are on model based samples innermost for loop in the algorithm docsepfinal recommendation i do not recommend accepting the paper the results have been greatly improved they now look decent and i have
improved my score as a result however i think the contributions are still not clearly highlighted i however encourage the authors to improve their paper summary this paper proposes to combine two approaches for offline reinforcement learning behaviorregularized methods and uncertaintyaware modelbased methods the proposed approach works in two steps first a conservative mdp is constructed by estimating a transition function and subtracting an uncertainty penalty from the reward for each state and action pair similarly to mopo yu et al 2020 awac nair et al 2020 is used to learn a policy pi using offline data and the penalized reward where pi is constrained to be close to the behavior policy in terms of kl divergence in a second optional step called mb2po the paper proposes to iteratively refine pi by sampling trajectories from the conservative mdp and improving pi using awac the proposed approach is compared to existing methods mopo bear brac cql on the d4rl benchmark the best result of the proposed approach ie with or without the second step is shown to be competitive with the state of the art strong points 1 the paper offers a possible theoretical reason for why the proposed approach may improve results it improves the bound on the difference between the return of the mdp and the approximate mdp 2 the experiments look good to me and show the performance of the method 3 results can be competitive with the state of the art weak points 1 my key concern about the paper is that at the moment it does not provide a non empirical method to decide whether to use the second step or not in the experiments the best out of the two results is used this makes it hard to assess the interest of the method 2 reading the paper was rather easy but identifying the difference between the contributions of the paper and previous work was not so easy for me 3 i am not sure i could reproduce the experiments for example the architecture of the models used is not described recommendation i vote for rejecting the paper the idea looks interesting and should be investigated further however currently the proposed idea is in my opinion not fully developed weak point 1 and the results while fine are not improving the state of the art enough to overlook this point details i like the theoretical intuition while not the foundation of the idea it is good to have a theoretical sanity check the second step of the method sometimes severely deteriorates the policy for me evaluating both policies and picking the best one is not a very satisfactory method for an offpolicy algorithm regarding the contributions of the paper i am under the impression that in the section presenting the method section 4 the novel contributions are equation 7 and associated text in section 41 equation 11 and associated text and section 43 it is however nice to have information about existing methods leveraged in this work i am not sure what the best solution is but i would like to suggest using they rather than we when the paper describes previous work around equation 6 the paper states that a problem with mopo is that estimating uncertainty is difficult and there will inevitably be model errors that will lead to overestimated qvalues and that the proposed approach addresses this issue i think it would be nice to have some experimental results highlighting that the proposed approach indeed fixes this issue perhaps by showing the uncertainty estimates and different resulting actions for different methods in problematic states questions could you please
correct me if i misunderstood the contributions of the paper minor details section 21 i think awac is used before being introduced algorithm 1 has no caption docsepthe authors make an attempt at offline rl thanks to a mix between behavioural policy regularization and model based policy optimization they basically combine two algorithms awac and depending on the level of safety given an epistemic uncertainty evaluation mopo may be additionally used to fine tune the policy unfortunately the work suffers from several severe weaknesses the writing is not good see the typo and minor comments section the positioning is biased and missing too many accounts out of 31 citations 12 are from the same author even more problematic most if not all the references on offline rl are from this author and therefore lack diversity in particular modelbased offline rl iyengar2005 nilim2005 and modelfree offline rl thomas2015b have a rich history more specifically equation 3 is identical to that of the rewardadjusted mdp found in petrik2016 the safe policy improvement objective has been considered for instance in thomas2015b petrik2016 equation 7 proposes to optimize the policy under a constraint on the policy search identical to online trpo which is evoked later that is very similar to laroche2019 except that the constraint is not state based and therefore probably less efficient if our initial policy does not achieve expert level performance and we are confident that we can learn an effective model with the available data then it is unclear how these decisions are sorted out performing those safety tests is an area of research in itself thomas2015a even if we assume that the algorithmic novelty is proven it seems pretty incremental since it amounts to performing a test to decide between two algorithms finally the experimental results do not save the day we observe that ours is always the max of awac and awacmb2po which is a little suspicious since we have no information on how the decision is made in comparison with cql it is not better but it is a strong baseline so its not improving the state of the art it would have been informative to show the behavioural performance in each setting typo and minor comments awac is used without citation or explanation first 21 the most common offpolicy modelfree algorithms are actorcritic algorithms that alternate between policy evaluation and policy improvement in order to learn an effective policy this is not actorcritic but policy iteration otherwise we use the fully trained awac policy these results are reported in the column ours in table 1 otherwise what sec 42 effect affect sec 43 degredation degradation iyengar2005 iyengar g n robust dynamic programming mathematics of operations research 302257280 2005 laroche2019 laroche r trichelair p tachet des combes r t 2019 may safe policy improvement with baseline bootstrapping in international conference on machine learning pp 36523661 nilim2005 nilim a and el ghaoui l robust control of markov decision processes with uncertain transition matrices operations research 535780798 2005 petrik2016 petrik m ghavamzadeh m chow y 2016 safe policy improvement by minimizing robust baseline regret in advances in neural information processing systems pp 22982306 thomas2015a thomas p s theocharous g ghavamzadeh m 2015 february highconfidence offpolicy evaluation in twentyninth aaai conference on artificial intelligence thomas2015b thomas p theocharous g ghavamzadeh m 2015 june high confidence policy improvement in international conference
on machine learning pp 23802388 ### Summary:
this paper proposes a method for offline reinforcement learning with modelbased policy optimization where they first learn a model of the environment transition dynamics a critic and the policy in an offline manner they basically learn the model by training an ensemble of probabilistic dynamics models represented by neural networks that output a diagonal gaussian distribution over the next state and reward then they use the covariance of the probabilistic dynamics model to get an uncertainty measure that they incorporate into the reward when training the policy with awac there were two main concerns raised by the reviewers 1 experiments as pointed out by the reviewers the experimental gains dont look very convincing in particular the performance of awac looks bad and mb2po doesnt give much gain on top of it it is not clear how much better the proposed method is doing on the tasks that it does well without any confidence intervals or variance measures provided 2 novelty this is almost a trivial combination of two existing ideas model based policy optimization and awac it is not clear how useful this particular combination is in practice and it seems like there are not many insights gained from it i think better motivations further ablations and more empirical analysis to understand the proposed model better for example analyzing the types of behaviors learned how calibrated the uncertainty estimates incorporated into the reward are or some hyperparameter sensitivity analysis would make the paper more interesting as it stands right now i am suggesting to reject this paper i hope the authors will improve the paper for the future
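Since the reviews and the meta review describe the same two ingredients, an uncertainty-penalized reward computed from an ensemble of Gaussian dynamics models (as in mopo) and an advantage-weighted awac-style policy update, a hedged sketch of what such a combination could look like is given below. This is not the authors' implementation; the model and policy interfaces (a model returning a mean and standard deviation, policy.log_prob, q_fn, value_fn) and the hyper-parameter values are assumptions made purely for illustration.

```python
# Minimal sketch, assuming PyTorch and the hypothetical interfaces named above;
# it mirrors the description in the reviews, not the authors' code.
import torch

def penalized_reward(ensemble, state, action, reward, lam=1.0):
    # r_tilde(s, a) = r(s, a) - lam * u(s, a), where u(s, a) is the largest
    # predicted standard-deviation norm across the ensemble (a common
    # mopo-style uncertainty heuristic).
    stds = []
    with torch.no_grad():
        for model in ensemble:                 # each model outputs a diagonal Gaussian
            _, std = model(state, action)      # over (next state, reward)
            stds.append(std.norm(dim=-1))
    uncertainty = torch.stack(stds, dim=0).max(dim=0).values
    return reward - lam * uncertainty

def awac_policy_loss(policy, q_fn, value_fn, states, actions, beta=1.0):
    # Advantage-weighted regression: log pi(a|s) is weighted by exp(A(s, a) / beta),
    # which implicitly keeps the policy close to the behavior that produced the data.
    with torch.no_grad():
        advantage = q_fn(states, actions) - value_fn(states)
        weights = torch.exp(advantage / beta).clamp(max=20.0)  # clipped for stability
    log_prob = policy.log_prob(states, actions)
    return -(weights * log_prob).mean()
```

In the mb2po fine-tuning step discussed above, both the offline transitions and short rollouts sampled from the learned ensemble would pass through penalized_reward before the awac update; whether that second step actually helps is exactly what the reviewers question.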
[ 452, 327, 616, 1211, 627, 5046, 7126, 2792, 253, 4477, 50274, 2327, 281, 3157, 2220, 3768, 68, 1543, 2299, 342, 253, 673, 7563, 18564, 281, 50274, 2520, 2278, 352, 310, 7479, 281, 1239, 285, 2096, 3768, 317, 285, 840, 38304, 50274, 47075, 253, 11701, 8686, 670, 407, 436, 26104, 50276, 19, 969, 891, 651, 751, 281, 49638, 4477, 323, 253, 2852, 789, 10746, 50274, 13012, 275, 253, 6452, 19235, 841, 403, 253, 1077, 1072, 3533, 50274, 74, 651, 452, 13937, 436, 2929, 12453, 323, 1650, 581, 273, 253, 10610, 897, 50274, 12866, 323, 28841, 391, 77, 651, 320, 326, 359, 403, 417, 6600, 273, 253, 4112, 1880, 50274, 783, 941, 310, 6485, 3646, 390, 27785, 3646, 594, 352, 310, 1077, 11132, 281, 50274, 8632, 504, 1880, 390, 417, 281, 4030, 19928, 253, 1543, 581, 31326, 432, 3768, 317, 323, 50274, 11667, 50276, 20, 581, 3239, 721, 4331, 503, 2542, 12494, 4477, 1333, 50274, 395, 359, 403, 13224, 50274, 3529, 359, 476, 3037, 271, 3576, 1566, 342, 253, 2130, 941, 840, 359, 513, 50274, 38092, 1442, 292, 25004, 970, 45505, 19, 5367, 891, 717, 417, 2119, 849, 359, 403, 5100, 436, 50273, 2520, 310, 247, 13241, 1953, 285, 417, 247, 21145, 33140, 7579, 577, 253, 4278, 273, 253, 3210, 908, 323, 4030, 25184, 403, 12125, 891, 858, 923, 534, 48257, 50274, 19286, 403, 908, 281, 1973, 8062, 3210, 891, 717, 2119, 4477, 651, 5194, 326, 50274, 66, 11454, 6928, 326, 18012, 247, 305, 12064, 3268, 651, 3835, 1781, 50274, 22478, 1177, 387, 8059, 751, 17857, 32888, 25761, 752, 2990, 310, 908, 651, 50274, 9188, 36211, 3486, 752, 476, 320, 6311, 407, 3576, 14339, 281, 253, 50274, 13982, 1840, 50274, 555, 993, 642, 3486, 327, 4868, 50275, 261, 627, 247, 246, 6227, 5816, 275, 253, 5933, 7652, 816, 1840, 823, 3410, 50276, 338, 50275, 1439, 513, 359, 878, 281, 3037, 10921, 1566, 50274, 251, 3239, 721, 1390, 12494, 253, 25142, 5393, 403, 327, 1566, 1754, 3530, 50275, 2966, 32848, 323, 6287, 275, 253, 5933, 50276, 7152, 33032, 13017, 17401, 891, 513, 417, 5583, 18738, 253, 2929, 253, 1543, 452, 644, 10260, 5520, 597, 1024, 1007, 12524, 285, 891, 452, 5520, 619, 4868, 347, 247, 906, 2299, 891, 1158, 253, 9021, 403, 1335, 417, 4518, 16318, 891, 2299, 11907, 253, 4477, 281, 3157, 616, 2929, 50275, 8774, 436, 2929, 29328, 281, 13398, 767, 7274, 323, 28841, 35221, 4715, 3879, 12846, 1025, 3082, 285, 11649, 13823, 1566, 3169, 3082, 253, 4081, 2746, 2987, 275, 767, 5018, 806, 247, 11518, 278, 12132, 310, 8818, 407, 26230, 247, 5502, 794, 27467, 285, 749, 6316, 272, 271, 11649, 12339, 432, 253, 10921, 323, 1016, 1375, 285, 2250, 4667, 12014, 281, 278, 38332, 340, 86, 1162, 355, 9169, 50276, 1403, 317, 295, 1094, 1162, 355, 9169, 310, 908, 281, 3037, 247, 3646, 12580, 970, 28841, 941, 285, 253, 29697, 1025, 10921, 835, 12580, 310, 20793, 281, 320, 4581, 281, 253, 3879, 3646, 275, 2426, 273, 27451, 23279, 275, 247, 1273, 15266, 3213, 1925, 45505, 19, 5367, 253, 2929, 29328, 281, 10040, 3146, 39494, 12580, 407, 10491, 24102, 432, 253, 11518, 278, 12132, 285, 11138, 12580, 970, 3768, 317, 253, 4081, 2746, 310, 2429, 281, 5368, 3082, 278, 38332, 8800, 1308, 317, 260, 5848, 327, 253, 277, 21, 8435, 22791, 253, 1682, 906, 273, 253, 4081, 2746, 26332, 342, 390, 1293, 253, 1273, 3213, 310, 2011, 281, 320, 12085, 342, 253, 1375, 273, 253, 1445, 50274, 9072, 2792, 50276, 18, 253, 2929, 6131, 247, 1896, 10527, 1921, 670, 2139, 253, 4081, 2746, 778, 3157, 906, 352, 19132, 253, 3033, 327, 253, 3064, 875, 253, 1091, 273, 253, 278, 12132, 285, 253, 16851, 278, 12132, 50276, 19, 253, 4679, 1007, 1175, 281, 479, 285, 921, 253, 3045, 273, 253, 1332, 50276, 20, 
1543, 476, 320, 12085, 342, 253, 1375, 273, 253, 1445, 50276, 20881, 2792, 50276, 18, 619, 2234, 4468, 670, 253, 2929, 310, 326, 387, 253, 2774, 352, 1057, 417, 2085, 247, 1327, 16774, 1332, 281, 7617, 1880, 281, 908, 253, 1273, 3213, 390, 417, 275, 253, 4679, 253, 1682, 562, 273, 253, 767, 1543, 310, 908, 436, 2789, 352, 1892, 281, 2939, 253, 1600, 273, 253, 1332, 50276, 19, 4361, 253, 2929, 369, 2581, 3477, 533, 12488, 253, 3064, 875, 253, 9021, 273, 253, 2929, 285, 2045, 789, 369, 417, 594, 3477, 323, 479, 50275, 20, 891, 717, 417, 2119, 891, 812, 18302, 253, 4679, 323, 1650, 253, 10336, 273, 253, 3210, 908, 310, 417, 2529, 50275, 250, 27167, 318, 50276, 74, 6273, 323, 33944, 253, 2929, 253, 2934, 4453, 4722, 285, 943, 320, 6949, 2007, 2299, 4390, 253, 4081, 2934, 310, 275, 619, 4743, 417, 4751, 372, 652, 80, 1882, 5075, 1127, 337, 285, 253, 1543, 1223, 4030, 403, 417, 11138, 253, 1375, 273, 253, 1445, 2217, 281, 20621, 436, 1127, 50274, 23454, 50276, 74, 751, 253, 10527, 30328, 1223, 417, 253, 12153, 273, 253, 2934, 352, 310, 1175, 281, 452, 247, 10527, 45985, 2451, 50276, 783, 1273, 3213, 273, 253, 1332, 4536, 18270, 16528, 684, 253, 3646, 323, 479, 16344, 1097, 7823, 285, 8871, 253, 1682, 581, 310, 417, 247, 1077, 20297, 1332, 323, 271, 745, 22872, 5933, 50273, 1747, 13218, 253, 9021, 273, 253, 2929, 891, 717, 762, 253, 13214, 326, 275, 253, 2593, 15250, 253, 1332, 2593, 577, 253, 4460, 9021, 403, 5150, 818, 285, 2330, 2505, 275, 2593, 7609, 5150, 1903, 285, 2330, 2505, 285, 2593, 7652, 352, 310, 2299, 5322, 281, 452, 1491, 670, 5368, 3082, 19732, 2961, 275, 436, 789, 891, 717, 417, 2119, 752, 253, 1682, 2900, 310, 533, 891, 651, 751, 281, 1804, 970, 597, 2581, 685, 359, 672, 253, 2929, 8631, 2045, 789, 50276, 16576, 5150, 721, 253, 2929, 3054, 326, 247, 1895, 342, 278, 38332, 310, 326, 26230, 11649, 310, 2834, 285, 627, 588, 24473, 320, 1566, 6332, 326, 588, 1421, 281, 35039, 18280, 2805, 8858, 285, 326, 253, 4081, 2746, 2953, 436, 2523, 891, 1158, 352, 651, 320, 5322, 281, 452, 690, 5661, 1543, 27321, 326, 253, 4081, 2746, 6296, 26019, 436, 2523, 4931, 407, 4645, 253, 11649, 8197, 285, 1027, 4795, 5231, 323, 1027, 3082, 275, 20276, 1375, 50274, 34974, 50276, 16534, 368, 4496, 3451, 479, 604, 891, 46485, 253, 9021, 273, 253, 2929, 50275, 37585, 4278, 2593, 3127, 891, 1158, 3768, 317, 310, 908, 1078, 1146, 5611, 5933, 337, 556, 642, 11743, 7152, 339, 431, 248, 4477, 1379, 271, 3177, 387, 28841, 391, 77, 6701, 281, 247, 5878, 875, 35174, 3646, 37820, 285, 1566, 1754, 3646, 13757, 597, 10323, 13398, 767, 11333, 3768, 991, 285, 7293, 327, 253, 1268, 273, 5252, 1677, 271, 30009, 11060, 11649, 7103, 278, 38332, 778, 320, 23000, 908, 281, 4030, 19928, 253, 3646, 50276, 328, 9520, 253, 789, 27171, 432, 2067, 5460, 32213, 50276, 783, 4028, 310, 417, 1175, 923, 253, 1745, 80, 285, 5884, 5701, 2593, 50276, 783, 19274, 310, 23539, 285, 5816, 281, 1142, 8553, 562, 273, 4562, 30404, 1249, 403, 432, 253, 1072, 2488, 1014, 625, 20276, 954, 604, 417, 512, 253, 10414, 327, 28841, 391, 77, 403, 432, 436, 2488, 285, 3103, 19756, 9991, 275, 1798, 1566, 3169, 28841, 391, 77, 891, 90, 1205, 274, 9204, 18789, 303, 9204, 285, 771, 813, 658, 28841, 391, 77, 289, 4921, 6620, 67, 452, 247, 6793, 2892, 625, 5742, 5150, 495, 310, 8931, 281, 326, 273, 253, 10921, 23811, 278, 12132, 1119, 275, 7590, 16409, 6961, 253, 4999, 3646, 7756, 8103, 556, 644, 2783, 323, 4227, 275, 289, 4921, 6620, 67, 8337, 16409, 6961, 5150, 818, 29328, 281, 22318, 253, 3646, 762, 247, 7658, 327, 253, 3646, 3186, 8931, 281, 3909, 492, 
5367, 534, 310, 32338, 1996, 326, 310, 1077, 2074, 281, 1236, 13807, 9638, 3707, 326, 253, 7658, 310, 417, 1375, 1754, 285, 3103, 3164, 1679, 5919, 50276, 338, 776, 3302, 3646, 1057, 417, 5115, 6485, 1268, 3045, 285, 359, 403, 13224, 326, 359, 476, 3037, 271, 3576, 1566, 342, 253, 2130, 941, 840, 50275, 262, 310, 12744, 849, 841, 7089, 403, 20045, 562, 9591, 1110, 5252, 5216, 403, 271, 2170, 273, 2561, 275, 3746, 289, 4921, 6620, 66, 50276, 9154, 604, 359, 5467, 326, 253, 5933, 280, 38135, 310, 11464, 352, 3133, 3965, 32809, 1580, 352, 8322, 281, 1347, 247, 1071, 281, 7617, 875, 767, 11333, 50276, 71, 3341, 253, 5661, 1543, 513, 417, 5745, 10666, 1388, 359, 10018, 326, 20451, 310, 1900, 253, 2781, 273, 3768, 317, 285, 3768, 317, 1814, 19, 5367, 534, 310, 247, 1652, 20634, 1580, 359, 452, 642, 1491, 327, 849, 253, 3061, 310, 1160, 275, 5301, 342, 260, 5848, 352, 310, 417, 1805, 533, 352, 310, 247, 2266, 8245, 594, 697, 417, 11138, 253, 1375, 273, 253, 1445, 352, 651, 452, 644, 27096, 281, 921, 253, 35174, 3045, 275, 1016, 4758, 50276, 555, 5367, 285, 5884, 5701, 50276, 1403, 317, 310, 908, 1293, 25577, 390, 8813, 806, 3127, 50276, 783, 954, 1846, 745, 22872, 771, 813, 658, 11333, 403, 12353, 68, 17425, 11333, 326, 17958, 875, 3646, 7103, 285, 3646, 7756, 275, 1340, 281, 3037, 271, 3576, 3646, 50276, 2520, 310, 417, 12353, 68, 17425, 533, 3646, 19502, 50276, 32240, 359, 897, 253, 4751, 10166, 3768, 317, 3646, 841, 1543, 403, 2361, 275, 253, 5084, 20451, 275, 2829, 337, 50276, 32240, 752, 50276, 1704, 5976, 1055, 50276, 66, 887, 50276, 1704, 7652, 6797, 433, 318, 50276, 615, 4971, 318, 50276, 14059, 1205, 274, 9204, 891, 90, 1205, 274, 305, 295, 10237, 7870, 10717, 23065, 273, 5871, 2561, 1884, 14832, 3547, 1438, 5826, 1236, 13807, 9638, 1236, 13807, 391, 492, 280, 2955, 1094, 268, 50276, 85, 2679, 85, 711, 2049, 265, 391, 246, 6247, 778, 4999, 3646, 7756, 342, 8245, 7491, 10981, 2784, 275, 5213, 8059, 327, 5145, 4715, 7266, 23412, 1508, 36630, 5296, 303, 9204, 5296, 303, 247, 285, 1045, 305, 3227, 276, 74, 298, 10237, 1453, 273, 1616, 729, 3061, 4870, 342, 8767, 5502, 12624, 5871, 2561, 608, 24424, 1438, 37348, 5826, 7590, 16409, 6961, 7590, 16409, 278, 305, 10940, 312, 91, 796, 73, 278, 50276, 348, 319, 340, 4022, 4999, 3646, 7756, 407, 28699, 10237, 8245, 14938, 275, 16424, 275, 11454, 1491, 5162, 2718, 7266, 26780, 3507, 18950, 289, 4921, 6620, 66, 289, 4921, 268, 256, 253, 80, 3615, 528, 305, 50276, 72, 10940, 312, 91, 796, 73, 278, 4104, 704, 67, 4701, 1029, 39943, 745, 22872, 7103, 275, 2500, 290, 1362, 11902, 39951, 2284, 8059, 327, 13345, 9260, 289, 4921, 6620, 67, 289, 4921, 268, 253, 80, 3615, 528, 305, 50276, 72, 10940, 312, 91, 796, 73, 278, 4104, 480, 2517, 1029, 7162, 3646, 7756, 275, 5213, 8059, 327, 5145, 4715, 7266, 3495, 1438, 1508, 2055, 187, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 1332, 323, 28841, 35221, 4715, 3082, 342, 1566, 3169, 3646, 13757, 835, 597, 806, 3037, 247, 1566, 273, 253, 3126, 281, 3037, 253, 5502, 8062, 247, 7291, 285, 253, 3646, 275, 271, 28841, 5133, 597, 10323, 3037, 253, 1566, 407, 3733, 271, 19862, 273, 37851, 8062, 3210, 6607, 407, 11454, 6928, 326, 3453, 247, 16421, 305, 12064, 3268, 689, 253, 1735, 1375, 285, 10921, 840, 597, 897, 253, 26677, 273, 253, 37851, 8062, 1566, 281, 755, 271, 11649, 2557, 326, 597, 19071, 715, 253, 50276, 250, 1034, 672, 3733, 352, 342, 253, 3768, 317, 50276, 9088, 497, 767, 2022, 7350, 5439, 407, 253, 30628, 50276, 18, 4679, 347, 8042, 562, 407, 253, 30628, 253, 5661, 15988, 13414, 1007, 1077, 21414, 275, 1798, 
253, 3045, 273, 3768, 317, 4453, 3076, 285, 45505, 19, 5367, 36908, 1918, 1199, 15988, 327, 1755, 273, 352, 352, 310, 417, 2590, 849, 1199, 1805, 253, 4081, 1332, 310, 2509, 327, 253, 8892, 326, 352, 1057, 973, 1293, 667, 7162, 11508, 390, 11041, 5593, 2530, 50276, 19, 38135, 50276, 2520, 310, 2761, 247, 14916, 5019, 273, 767, 5368, 5697, 1566, 1754, 3646, 13757, 285, 3768, 317, 352, 310, 417, 2590, 849, 4217, 436, 1798, 5019, 310, 275, 3946, 285, 352, 3133, 751, 627, 310, 417, 1199, 16039, 12103, 432, 352, 50275, 74, 1158, 1805, 42852, 2007, 490, 77, 569, 285, 625, 16774, 1783, 281, 2096, 253, 4081, 1566, 1805, 323, 1650, 18918, 253, 3510, 273, 13576, 6311, 390, 849, 35890, 253, 11649, 8197, 326, 310, 11217, 715, 253, 10921, 310, 390, 690, 4373, 19484, 7340, 1783, 651, 1056, 253, 2929, 625, 4722, 50276, 284, 352, 9572, 987, 1024, 891, 717, 7738, 281, 12009, 436, 2929, 891, 3524, 253, 4477, 588, 3157, 253, 2929, 323, 253, 2852, 50276 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 452, 327, 616, 1211, 627, 5046, 7126, 2792, 253, 4477, 50274, 2327, 281, 3157, 2220, 3768, 68, 1543, 2299, 342, 253, 673, 7563, 18564, 281, 50274, 2520, 2278, 352, 310, 7479, 281, 1239, 285, 2096, 3768, 317, 285, 840, 38304, 50274, 47075, 253, 11701, 8686, 670, 407, 436, 26104, 50276, 19, 969, 891, 651, 751, 281, 49638, 4477, 323, 253, 2852, 789, 10746, 50274, 13012, 275, 253, 6452, 19235, 841, 403, 253, 1077, 1072, 3533, 50274, 74, 651, 452, 13937, 436, 2929, 12453, 323, 1650, 581, 273, 253, 10610, 897, 50274, 12866, 323, 28841, 391, 77, 651, 320, 326, 359, 403, 417, 6600, 273, 253, 4112, 1880, 50274, 783, 941, 310, 6485, 3646, 390, 27785, 3646, 594, 352, 310, 1077, 11132, 281, 50274, 8632, 504, 1880, 390, 417, 281, 4030, 19928, 253, 1543, 581, 31326, 432, 3768, 317, 323, 50274, 11667, 50276, 20, 581, 3239, 721, 4331, 503, 2542, 12494, 4477, 1333, 50274, 395, 359, 403, 13224, 50274, 3529, 359, 476, 3037, 271, 3576, 1566, 342, 253, 2130, 941, 840, 359, 513, 50274, 38092, 1442, 292, 25004, 970, 45505, 19, 5367, 891, 717, 417, 2119, 849, 359, 403, 5100, 436, 50273, 2520, 310, 247, 13241, 1953, 285, 417, 247, 21145, 33140, 7579, 577, 253, 4278, 273, 253, 3210, 908, 323, 4030, 25184, 403, 12125, 891, 858, 923, 534, 48257, 50274, 19286, 403, 908, 281, 1973, 8062, 3210, 891, 717, 2119, 4477, 651, 5194, 326, 50274, 66, 11454, 6928, 326, 18012, 247, 305, 12064, 3268, 651, 3835, 1781, 50274, 22478, 1177, 387, 8059, 751, 17857, 32888, 25761, 752, 2990, 310, 908, 651, 50274, 9188, 36211, 3486, 752, 476, 320, 6311, 407, 3576, 14339, 281, 253, 50274, 13982, 1840, 50274, 555, 993, 642, 3486, 327, 4868, 50275, 261, 627, 247, 246, 6227, 5816, 275, 253, 5933, 7652, 816, 1840, 823, 3410, 50276, 338, 50275, 1439, 513, 359, 878, 281, 3037, 10921, 1566, 50274, 251, 3239, 721, 1390, 12494, 253, 25142, 5393, 403, 327, 1566, 1754, 3530, 50275, 2966, 32848, 323, 6287, 275, 253, 5933, 50276, 7152, 33032, 13017, 17401, 891, 513, 417, 5583, 18738, 253, 2929, 253, 1543, 452, 644, 10260, 5520, 597, 1024, 1007, 12524, 285, 891, 452, 5520, 619, 4868, 347, 247, 906, 2299, 891, 1158, 253, 9021, 403, 1335, 417, 4518, 16318, 891, 2299, 11907, 253, 4477, 281, 3157, 616, 2929, 50275, 8774, 436, 2929, 29328, 281, 13398, 767, 7274, 323, 28841, 35221, 4715, 3879, 12846, 1025, 3082, 285, 11649, 13823, 1566, 3169, 3082, 253, 4081, 2746, 2987, 275, 767, 5018, 806, 247, 11518, 278, 12132, 310, 8818, 407, 26230, 247, 5502, 794, 27467, 285, 749, 6316, 272, 271, 11649, 12339, 432, 253, 10921, 323, 1016, 1375, 285, 2250, 4667, 12014, 281, 278, 38332, 340, 86, 1162, 355, 9169, 50276, 1403, 317, 295, 1094, 1162, 355, 9169, 310, 908, 281, 3037, 247, 3646, 12580, 970, 28841, 941, 285, 253, 29697, 1025, 10921, 835, 12580, 310, 20793, 281, 320, 4581, 281, 253, 3879, 3646, 275, 2426, 273, 27451, 23279, 275, 247, 1273, 15266, 3213, 1925, 45505, 19, 5367, 253, 2929, 29328, 281, 10040, 3146, 39494, 12580, 407, 10491, 24102, 432, 253, 11518, 278, 12132, 285, 11138, 12580, 970, 3768, 317, 253, 4081, 2746, 310, 2429, 281, 5368, 3082, 278, 38332, 8800, 1308, 317, 260, 5848, 327, 253, 277, 21, 8435, 22791, 253, 1682, 906, 273, 253, 4081, 2746, 26332, 342, 390, 1293, 253, 1273, 3213, 310, 2011, 281, 320, 12085, 342, 253, 1375, 273, 253, 1445, 50274, 9072, 2792, 50276, 18, 253, 2929, 6131, 247, 1896, 10527, 1921, 670, 2139, 253, 4081, 2746, 778, 3157, 906, 352, 19132, 253, 3033, 327, 253, 3064, 875, 253, 1091, 273, 253, 278, 12132, 285, 253, 16851, 278, 12132, 50276, 19, 253, 4679, 1007, 1175, 281, 479, 285, 921, 253, 3045, 273, 253, 1332, 50276, 20, 
1543, 476, 320, 12085, 342, 253, 1375, 273, 253, 1445, 50276, 20881, 2792, 50276, 18, 619, 2234, 4468, 670, 253, 2929, 310, 326, 387, 253, 2774, 352, 1057, 417, 2085, 247, 1327, 16774, 1332, 281, 7617, 1880, 281, 908, 253, 1273, 3213, 390, 417, 275, 253, 4679, 253, 1682, 562, 273, 253, 767, 1543, 310, 908, 436, 2789, 352, 1892, 281, 2939, 253, 1600, 273, 253, 1332, 50276, 19, 4361, 253, 2929, 369, 2581, 3477, 533, 12488, 253, 3064, 875, 253, 9021, 273, 253, 2929, 285, 2045, 789, 369, 417, 594, 3477, 323, 479, 50275, 20, 891, 717, 417, 2119, 891, 812, 18302, 253, 4679, 323, 1650, 253, 10336, 273, 253, 3210, 908, 310, 417, 2529, 50275, 250, 27167, 318, 50276, 74, 6273, 323, 33944, 253, 2929, 253, 2934, 4453, 4722, 285, 943, 320, 6949, 2007, 2299, 4390, 253, 4081, 2934, 310, 275, 619, 4743, 417, 4751, 372, 652, 80, 1882, 5075, 1127, 337, 285, 253, 1543, 1223, 4030, 403, 417, 11138, 253, 1375, 273, 253, 1445, 2217, 281, 20621, 436, 1127, 50274, 23454, 50276, 74, 751, 253, 10527, 30328, 1223, 417, 253, 12153, 273, 253, 2934, 352, 310, 1175, 281, 452, 247, 10527, 45985, 2451, 50276, 783, 1273, 3213, 273, 253, 1332, 4536, 18270, 16528, 684, 253, 3646, 323, 479, 16344, 1097, 7823, 285, 8871, 253, 1682, 581, 310, 417, 247, 1077, 20297, 1332, 323, 271, 745, 22872, 5933, 50273, 1747, 13218, 253, 9021, 273, 253, 2929, 891, 717, 762, 253, 13214, 326, 275, 253, 2593, 15250, 253, 1332, 2593, 577, 253, 4460, 9021, 403, 5150, 818, 285, 2330, 2505, 275, 2593, 7609, 5150, 1903, 285, 2330, 2505, 285, 2593, 7652, 352, 310, 2299, 5322, 281, 452, 1491, 670, 5368, 3082, 19732, 2961, 275, 436, 789, 891, 717, 417, 2119, 752, 253, 1682, 2900, 310, 533, 891, 651, 751, 281, 1804, 970, 597, 2581, 685, 359, 672, 253, 2929, 8631, 2045, 789, 50276, 16576, 5150, 721, 253, 2929, 3054, 326, 247, 1895, 342, 278, 38332, 310, 326, 26230, 11649, 310, 2834, 285, 627, 588, 24473, 320, 1566, 6332, 326, 588, 1421, 281, 35039, 18280, 2805, 8858, 285, 326, 253, 4081, 2746, 2953, 436, 2523, 891, 1158, 352, 651, 320, 5322, 281, 452, 690, 5661, 1543, 27321, 326, 253, 4081, 2746, 6296, 26019, 436, 2523, 4931, 407, 4645, 253, 11649, 8197, 285, 1027, 4795, 5231, 323, 1027, 3082, 275, 20276, 1375, 50274, 34974, 50276, 16534, 368, 4496, 3451, 479, 604, 891, 46485, 253, 9021, 273, 253, 2929, 50275, 37585, 4278, 2593, 3127, 891, 1158, 3768, 317, 310, 908, 1078, 1146, 5611, 5933, 337, 556, 642, 11743, 7152, 339, 431, 248, 4477, 1379, 271, 3177, 387, 28841, 391, 77, 6701, 281, 247, 5878, 875, 35174, 3646, 37820, 285, 1566, 1754, 3646, 13757, 597, 10323, 13398, 767, 11333, 3768, 991, 285, 7293, 327, 253, 1268, 273, 5252, 1677, 271, 30009, 11060, 11649, 7103, 278, 38332, 778, 320, 23000, 908, 281, 4030, 19928, 253, 3646, 50276, 328, 9520, 253, 789, 27171, 432, 2067, 5460, 32213, 50276, 783, 4028, 310, 417, 1175, 923, 253, 1745, 80, 285, 5884, 5701, 2593, 50276, 783, 19274, 310, 23539, 285, 5816, 281, 1142, 8553, 562, 273, 4562, 30404, 1249, 403, 432, 253, 1072, 2488, 1014, 625, 20276, 954, 604, 417, 512, 253, 10414, 327, 28841, 391, 77, 403, 432, 436, 2488, 285, 3103, 19756, 9991, 275, 1798, 1566, 3169, 28841, 391, 77, 891, 90, 1205, 274, 9204, 18789, 303, 9204, 285, 771, 813, 658, 28841, 391, 77, 289, 4921, 6620, 67, 452, 247, 6793, 2892, 625, 5742, 5150, 495, 310, 8931, 281, 326, 273, 253, 10921, 23811, 278, 12132, 1119, 275, 7590, 16409, 6961, 253, 4999, 3646, 7756, 8103, 556, 644, 2783, 323, 4227, 275, 289, 4921, 6620, 67, 8337, 16409, 6961, 5150, 818, 29328, 281, 22318, 253, 3646, 762, 247, 7658, 327, 253, 3646, 3186, 8931, 281, 3909, 492, 
5367, 534, 310, 32338, 1996, 326, 310, 1077, 2074, 281, 1236, 13807, 9638, 3707, 326, 253, 7658, 310, 417, 1375, 1754, 285, 3103, 3164, 1679, 5919, 50276, 338, 776, 3302, 3646, 1057, 417, 5115, 6485, 1268, 3045, 285, 359, 403, 13224, 326, 359, 476, 3037, 271, 3576, 1566, 342, 253, 2130, 941, 840, 50275, 262, 310, 12744, 849, 841, 7089, 403, 20045, 562, 9591, 1110, 5252, 5216, 403, 271, 2170, 273, 2561, 275, 3746, 289, 4921, 6620, 66, 50276, 9154, 604, 359, 5467, 326, 253, 5933, 280, 38135, 310, 11464, 352, 3133, 3965, 32809, 1580, 352, 8322, 281, 1347, 247, 1071, 281, 7617, 875, 767, 11333, 50276, 71, 3341, 253, 5661, 1543, 513, 417, 5745, 10666, 1388, 359, 10018, 326, 20451, 310, 1900, 253, 2781, 273, 3768, 317, 285, 3768, 317, 1814, 19, 5367, 534, 310, 247, 1652, 20634, 1580, 359, 452, 642, 1491, 327, 849, 253, 3061, 310, 1160, 275, 5301, 342, 260, 5848, 352, 310, 417, 1805, 533, 352, 310, 247, 2266, 8245, 594, 697, 417, 11138, 253, 1375, 273, 253, 1445, 352, 651, 452, 644, 27096, 281, 921, 253, 35174, 3045, 275, 1016, 4758, 50276, 555, 5367, 285, 5884, 5701, 50276, 1403, 317, 310, 908, 1293, 25577, 390, 8813, 806, 3127, 50276, 783, 954, 1846, 745, 22872, 771, 813, 658, 11333, 403, 12353, 68, 17425, 11333, 326, 17958, 875, 3646, 7103, 285, 3646, 7756, 275, 1340, 281, 3037, 271, 3576, 3646, 50276, 2520, 310, 417, 12353, 68, 17425, 533, 3646, 19502, 50276, 32240, 359, 897, 253, 4751, 10166, 3768, 317, 3646, 841, 1543, 403, 2361, 275, 253, 5084, 20451, 275, 2829, 337, 50276, 32240, 752, 50276, 1704, 5976, 1055, 50276, 66, 887, 50276, 1704, 7652, 6797, 433, 318, 50276, 615, 4971, 318, 50276, 14059, 1205, 274, 9204, 891, 90, 1205, 274, 305, 295, 10237, 7870, 10717, 23065, 273, 5871, 2561, 1884, 14832, 3547, 1438, 5826, 1236, 13807, 9638, 1236, 13807, 391, 492, 280, 2955, 1094, 268, 50276, 85, 2679, 85, 711, 2049, 265, 391, 246, 6247, 778, 4999, 3646, 7756, 342, 8245, 7491, 10981, 2784, 275, 5213, 8059, 327, 5145, 4715, 7266, 23412, 1508, 36630, 5296, 303, 9204, 5296, 303, 247, 285, 1045, 305, 3227, 276, 74, 298, 10237, 1453, 273, 1616, 729, 3061, 4870, 342, 8767, 5502, 12624, 5871, 2561, 608, 24424, 1438, 37348, 5826, 7590, 16409, 6961, 7590, 16409, 278, 305, 10940, 312, 91, 796, 73, 278, 50276, 348, 319, 340, 4022, 4999, 3646, 7756, 407, 28699, 10237, 8245, 14938, 275, 16424, 275, 11454, 1491, 5162, 2718, 7266, 26780, 3507, 18950, 289, 4921, 6620, 66, 289, 4921, 268, 256, 253, 80, 3615, 528, 305, 50276, 72, 10940, 312, 91, 796, 73, 278, 4104, 704, 67, 4701, 1029, 39943, 745, 22872, 7103, 275, 2500, 290, 1362, 11902, 39951, 2284, 8059, 327, 13345, 9260, 289, 4921, 6620, 67, 289, 4921, 268, 253, 80, 3615, 528, 305, 50276, 72, 10940, 312, 91, 796, 73, 278, 4104, 480, 2517, 1029, 7162, 3646, 7756, 275, 5213, 8059, 327, 5145, 4715, 7266, 3495, 1438, 1508, 2055, 187, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 1332, 323, 28841, 35221, 4715, 3082, 342, 1566, 3169, 3646, 13757, 835, 597, 806, 3037, 247, 1566, 273, 253, 3126, 281, 3037, 253, 5502, 8062, 247, 7291, 285, 253, 3646, 275, 271, 28841, 5133, 597, 10323, 3037, 253, 1566, 407, 3733, 271, 19862, 273, 37851, 8062, 3210, 6607, 407, 11454, 6928, 326, 3453, 247, 16421, 305, 12064, 3268, 689, 253, 1735, 1375, 285, 10921, 840, 597, 897, 253, 26677, 273, 253, 37851, 8062, 1566, 281, 755, 271, 11649, 2557, 326, 597, 19071, 715, 253, 50276, 250, 1034, 672, 3733, 352, 342, 253, 3768, 317, 50276, 9088, 497, 767, 2022, 7350, 5439, 407, 253, 30628, 50276, 18, 4679, 347, 8042, 562, 407, 253, 30628, 253, 5661, 15988, 13414, 1007, 1077, 21414, 275, 1798, 
253, 3045, 273, 3768, 317, 4453, 3076, 285, 45505, 19, 5367, 36908, 1918, 1199, 15988, 327, 1755, 273, 352, 352, 310, 417, 2590, 849, 1199, 1805, 253, 4081, 1332, 310, 2509, 327, 253, 8892, 326, 352, 1057, 973, 1293, 667, 7162, 11508, 390, 11041, 5593, 2530, 50276, 19, 38135, 50276, 2520, 310, 2761, 247, 14916, 5019, 273, 767, 5368, 5697, 1566, 1754, 3646, 13757, 285, 3768, 317, 352, 310, 417, 2590, 849, 4217, 436, 1798, 5019, 310, 275, 3946, 285, 352, 3133, 751, 627, 310, 417, 1199, 16039, 12103, 432, 352, 50275, 74, 1158, 1805, 42852, 2007, 490, 77, 569, 285, 625, 16774, 1783, 281, 2096, 253, 4081, 1566, 1805, 323, 1650, 18918, 253, 3510, 273, 13576, 6311, 390, 849, 35890, 253, 11649, 8197, 326, 310, 11217, 715, 253, 10921, 310, 390, 690, 4373, 19484, 7340, 1783, 651, 1056, 253, 2929, 625, 4722, 50276, 284, 352, 9572, 987, 1024, 891, 717, 7738, 281, 12009, 436, 2929, 891, 3524, 253, 4477, 588, 3157, 253, 2929, 323, 253, 2852, 50276 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors propose a maximum likelihood approach for irl theoretical results show that the algorithm converges in finite time and that for linear rewards that the mlirl problem is dual with maxentirl and that there is strong duality empirical results show that mlirl outperforms stateoftheart irl algorithms across several standard benchmarks the strong duality theorem appears novel and draws an interesting connection between mlirl and maxentirl the authors prove the first finite guarantees for irl with nonlinear reward functions the empirical results are promising and improve upon soa some of the claims of novelty seem to be standard practice and are used in prior work it is unclear why an ml approach to irl should perform better than other approaches the authors have nice theory but in practice the implemented algorithm seems very similar to prior work it is unclear how statistically significant the empirical results are with only 3 seeds and no confidence intervals a discussion of limitations are lacking in the main body of the paper docsepthe paper formalizes irl as a entropyregularized maximum likelihood problem the authors show that this problem is dual to maximum entropy irl and then formulates this problem as a bilevel optimization they propose an algorithm that iterates between single steps of evaluation and policy improvement in contrast to most max entropy irl methods that have to solve rl problems in an inner loop for this algorithm the authors provide a finite time convergence guarantee under some regularity conditions finally they provide an empirical evaluation on standard mujoco task demonstrating that their algorithm outperforms alternative irl approaches strengths the paper provides the first finitetime convergence analysis without assuming linear rewards as far as i can tell the assumptions of the main theoretical results seem pretty reasonable this is a solid theoretical contribution the duality between maxent irl and the proposed maximum likelihood formulation might be interesting to the irl community and could motivate interesting followup work the authors provide a good experimental evaluation that covers most relevant baselines and shows that their algorithm can perform well empirically the authors discuss the novelty of their work and provide particularly appropriate comparisons to results in prior work at multiple point eg for their ml formulation the proposed algorithm and the main theoretical result weaknesses the duality with maxent irl is not particularly novel and has been observed before in other context as noted by the authors the results in table 1 dont show very large differences and it is hard to evaluate if these are significant differences given that only 3 random seeds were run the experimental evaluation is limited to mujoco locomotion tasks which can sometimes be too simple to draw strong conclusions overall the writing could be clearer at times and the structure of the paper could be improved for example the introduction spends a lot of time discussion prior work before stating the contributions of this paper which is suboptimal i did not find any discussion of potential broader impact of this work docsepthis paper proposes a novel singleloop irl algorithm to reduce the computational burden of the previous irl framework the authors prove that the solution of the proposed framework is equivalent to the maximum entropy irl framework with a 
linearly parameterized reward function they also provide a convergence guarantee of the proposed algorithm the experiment result seems promising strengths the paper is well written the problem studied in this paper is important and the proposed method is novel the authors provide theoretical guarantees for the proposed framework weaknesses there may need more literature review on the maxentirl although the mlirl framework could reduce the computational complexity it is not obvious why the proposed mlirl framework works for the transfer learning tasks the authors adequately addressed the limitations and potential negative societal impact of their work docsepin this paper the authors proposed a new formulation of irl based on maximum likelihood which is equivalent to maxent when the reward function is linear by leveraging this formulation the authors propose a new computationally efficient gradientbased iterative algorithm that does not require solving mdp in each iteration the authors further prove a nonasymptotically rate of how fast the algorithm converges to a stationary point which is the first nonasymptotic convergence result for ilr with nonlinear reward parameterization finally the authors conduct extensive experiments that show that the proposed algorithm outperforms several stateoftheart irl algorithms strength the authors claim that thm 2 is the first nonasymptotic convergence result for ilr with nonlinear reward parameterization i believe this is true as a recent paper 1 from icml 2021 proves a nonasymptotic rate for a gradientbased method with linear reward parameterization although that papers proof does not seem to have the assumption the value function can be accurately estimated weakness i think the contribution above is important to the irl community however other contributions of this paper may not be as significant as this one 1 the new formulation of irl this maximum likelihood formulation is known 2 the proof seems to be similar to thm 3 in 3 2 the algorithm the authors mentioned that the algorithm enjoys computational efficiency and is capable of reward transferring however gailtype of algorithms that use statedependent rewards in the discriminator 4 and scalable maxent irl algorithms 5 are able to achieve those benefits as well 3 the experiments i like the experiments on reward transferring however the main goal of proposing the new algorithm and theory is to reduce computation burden but the experiments focus on the quality of the learned policy reward after convergence its unclear from the experiments whether the new algorithm is indeed faster for example an interesting experiment would be a comparison with maxent irl as in theory the proposed method is able to converge to a stationary point thats the same as maxent irl but faster furthermore the clarity of the paper can be improved line 118 to 135 is rather confusing and only until the second reading i was able to understand the math in line 119pitheta is parameterized by theta but in line 123 theta is used to parameterize the reward function 1 kamoutsi angeliki goran banjac and john lygeros efficient performance bounds for primaldual reinforcement learning from demonstrations international conference on machine learning pmlr 2021 2 jain vinamra prashant doshi and bikramjit banerjee modelfree irl using maximum likelihood estimation proceedings of the aaai conference on artificial intelligence vol 33 no 01 2019 3 ziebart brian d j andrew bagnell and anind k dey the principle of maximum causal entropy for estimating 
interacting processes ieee transactions on information theory vol 59 no 4 2013 pp 1966-1980 4 torabi faraz garrett warnell and peter stone generative adversarial imitation from observation arxiv preprint arxiv:1807.06158 2018 5 finn chelsea sergey levine and pieter abbeel guided cost learning deep inverse optimal control via policy optimization international conference on machine learning pmlr 2016 yes ### Summary:
this paper presents a new single loop irl algorithm that avoids the typical policyreward optimization loop in irl algorithms without sacrificing the accuracy of the learned reward function this is achieved through the use of stochastic gradients of the likelihood function the proposed algorithm is proved to converge to a stationary solution with a finitetime guarantee experiments on some problems in mujoco show that the proposed algorithm can outperform existing solutions the reviewers all agree that the paper is wellwritten the algorithm is sufficiently new and the experiments are compelling there are some concerns that experimental evaluation is limited to mujoco locomotion tasks which can sometimes be too simple to draw strong conclusions
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 4477, 12661, 247, 4869, 12177, 2746, 323, 209, 2587, 10527, 1543, 921, 326, 253, 5933, 26414, 275, 6486, 673, 285, 326, 323, 4872, 23267, 326, 253, 13361, 2587, 1895, 310, 8746, 342, 2781, 290, 2587, 285, 326, 627, 310, 2266, 34962, 16774, 1543, 921, 326, 13361, 2587, 41731, 13015, 1375, 23037, 14387, 209, 2587, 11333, 2439, 2067, 2629, 49602, 253, 2266, 34962, 10012, 4620, 4460, 285, 21354, 271, 4722, 4602, 875, 13361, 2587, 285, 2781, 290, 2587, 50276, 783, 4477, 5276, 253, 806, 6486, 23632, 323, 209, 2587, 342, 14561, 10921, 3470, 50276, 783, 16774, 1543, 403, 12532, 285, 3157, 2220, 594, 66, 50276, 8826, 273, 253, 3916, 273, 38135, 1646, 281, 320, 2629, 3946, 285, 403, 908, 275, 2720, 789, 50276, 262, 310, 12744, 2139, 271, 13361, 2746, 281, 209, 2587, 943, 1347, 1805, 685, 643, 7274, 253, 4477, 452, 5322, 3762, 533, 275, 3946, 253, 9009, 5933, 3133, 1077, 2074, 281, 2720, 789, 50276, 262, 310, 12744, 849, 10126, 1534, 253, 16774, 1543, 403, 342, 760, 495, 12922, 285, 642, 7162, 11508, 247, 5955, 273, 7364, 403, 14999, 275, 253, 2022, 2133, 273, 253, 2929, 5474, 339, 431, 248, 2929, 7473, 4219, 209, 2587, 347, 247, 15579, 12846, 1025, 4869, 12177, 1895, 253, 4477, 921, 326, 436, 1895, 310, 8746, 281, 4869, 15579, 209, 2587, 285, 840, 17075, 684, 436, 1895, 347, 247, 26413, 652, 13757, 597, 12661, 271, 5933, 326, 10040, 684, 875, 2014, 5018, 273, 7103, 285, 3646, 7756, 275, 4499, 281, 954, 2781, 15579, 209, 2587, 3082, 326, 452, 281, 8415, 391, 77, 3237, 275, 271, 6703, 6287, 323, 436, 5933, 253, 4477, 2085, 247, 6486, 673, 14940, 12215, 762, 690, 31793, 2515, 4720, 597, 2085, 271, 16774, 7103, 327, 2629, 278, 10441, 16856, 4836, 17227, 326, 616, 5933, 41731, 13015, 5795, 209, 2587, 7274, 20544, 50276, 783, 2929, 3400, 253, 806, 1442, 262, 7816, 14940, 1783, 1293, 7384, 4872, 23267, 347, 2080, 347, 891, 476, 2028, 253, 13260, 273, 253, 2022, 10527, 1543, 1646, 3965, 5272, 436, 310, 247, 4891, 10527, 7680, 50276, 783, 34962, 875, 2781, 290, 209, 2587, 285, 253, 4081, 4869, 12177, 15895, 1537, 320, 4722, 281, 253, 209, 2587, 3114, 285, 812, 41509, 4722, 956, 484, 789, 50276, 783, 4477, 2085, 247, 1175, 5661, 7103, 326, 10949, 954, 4623, 1666, 25379, 285, 2722, 326, 616, 5933, 476, 1347, 973, 45190, 50276, 783, 4477, 2319, 253, 38135, 273, 616, 789, 285, 2085, 3782, 4569, 14023, 281, 1543, 275, 2720, 789, 387, 2709, 1127, 24088, 323, 616, 13361, 15895, 253, 4081, 5933, 285, 253, 2022, 10527, 906, 50275, 20881, 1255, 265, 50276, 783, 34962, 342, 2781, 290, 209, 2587, 310, 417, 3782, 4460, 285, 556, 644, 2540, 1078, 275, 643, 3634, 347, 4879, 407, 253, 4477, 50276, 783, 1543, 275, 2829, 337, 13414, 921, 1077, 1781, 3910, 285, 352, 310, 1892, 281, 7472, 604, 841, 403, 1534, 3910, 1677, 326, 760, 495, 3632, 12922, 497, 1408, 50276, 783, 5661, 7103, 310, 3710, 281, 278, 10441, 16856, 23904, 5011, 8892, 534, 476, 4536, 320, 1512, 2969, 281, 3812, 2266, 11815, 50276, 1189, 455, 253, 4028, 812, 320, 30909, 387, 2069, 285, 253, 2605, 273, 253, 2929, 812, 320, 5520, 323, 1650, 253, 10199, 30885, 247, 2257, 273, 673, 5955, 2720, 789, 1078, 14851, 253, 9021, 273, 436, 2929, 534, 310, 749, 29776, 891, 858, 417, 1089, 667, 5955, 273, 2442, 16055, 3486, 273, 436, 789, 5474, 33032, 2520, 2929, 29328, 247, 4460, 2014, 14075, 209, 2587, 5933, 281, 4796, 253, 15180, 7977, 273, 253, 2045, 209, 2587, 7792, 253, 4477, 5276, 326, 253, 2900, 273, 253, 4081, 7792, 
310, 6425, 281, 253, 4869, 15579, 209, 2587, 7792, 342, 247, 50276, 1282, 1285, 4764, 1025, 10921, 1159, 597, 671, 2085, 247, 14940, 12215, 273, 253, 4081, 5933, 253, 3368, 906, 3133, 12532, 50276, 296, 3755, 20556, 50275, 783, 2929, 310, 973, 3542, 50275, 783, 1895, 5421, 275, 436, 2929, 310, 1774, 285, 253, 4081, 1332, 310, 4460, 50276, 783, 4477, 2085, 10527, 23632, 323, 253, 4081, 7792, 50276, 20881, 1255, 265, 50275, 9088, 778, 878, 625, 6239, 2278, 327, 253, 2781, 290, 2587, 50276, 20261, 253, 13361, 2587, 7792, 812, 4796, 253, 15180, 10454, 352, 310, 417, 4755, 2139, 253, 4081, 13361, 2587, 7792, 2987, 323, 253, 3700, 4715, 8892, 50276, 783, 4477, 18212, 9713, 253, 7364, 285, 2442, 4016, 38058, 3486, 273, 616, 789, 5474, 339, 9852, 436, 2929, 253, 4477, 4081, 247, 747, 15895, 273, 209, 2587, 1754, 327, 4869, 12177, 534, 310, 6425, 281, 2781, 290, 672, 253, 10921, 1159, 310, 4872, 50276, 1615, 19732, 2977, 436, 15895, 253, 4477, 12661, 247, 747, 43245, 5919, 11786, 3169, 34560, 5933, 326, 1057, 417, 2430, 16161, 278, 12132, 275, 1016, 19502, 253, 4477, 2007, 5276, 247, 1327, 284, 40045, 33248, 2281, 273, 849, 3809, 253, 5933, 26414, 281, 247, 17429, 1127, 534, 310, 253, 806, 1327, 284, 40045, 3875, 14940, 906, 323, 4164, 83, 342, 14561, 10921, 4764, 1320, 4720, 253, 4477, 2589, 9470, 4679, 326, 921, 326, 253, 4081, 5933, 41731, 13015, 2067, 1375, 23037, 14387, 209, 2587, 11333, 50275, 45563, 50275, 783, 4477, 1750, 326, 289, 78, 374, 310, 253, 806, 1327, 284, 40045, 3875, 14940, 906, 323, 4164, 83, 342, 14561, 10921, 4764, 1320, 891, 2868, 436, 310, 2032, 347, 247, 3332, 2929, 337, 432, 17857, 1686, 43425, 19539, 247, 1327, 284, 40045, 3875, 2281, 323, 247, 11786, 3169, 1332, 342, 4872, 10921, 4764, 1320, 3738, 326, 9380, 4737, 1057, 417, 1646, 281, 452, 253, 9376, 253, 1318, 1159, 476, 320, 13613, 5998, 50274, 20881, 1255, 50275, 74, 1158, 253, 7680, 1840, 310, 1774, 281, 253, 209, 2587, 3114, 50276, 35529, 643, 9021, 273, 436, 2929, 778, 417, 320, 347, 1534, 347, 436, 581, 50275, 18, 253, 747, 15895, 273, 209, 2587, 436, 4869, 12177, 15895, 310, 1929, 374, 253, 4737, 3133, 281, 320, 2074, 281, 289, 78, 495, 275, 495, 50276, 19, 253, 5933, 253, 4477, 5393, 326, 253, 5933, 29566, 15180, 6733, 285, 310, 7032, 273, 10921, 27090, 2299, 305, 647, 881, 273, 11333, 326, 897, 4767, 2662, 23267, 275, 253, 7134, 12915, 577, 285, 44755, 2781, 290, 209, 2587, 11333, 608, 403, 2104, 281, 5115, 1110, 5373, 347, 973, 50276, 20, 253, 4679, 891, 751, 253, 4679, 327, 10921, 27090, 2299, 253, 2022, 4736, 273, 36636, 253, 747, 5933, 285, 3762, 310, 281, 4796, 13782, 7977, 533, 253, 4679, 2770, 327, 253, 3290, 273, 253, 6311, 3646, 50276, 250, 1034, 846, 14940, 697, 12744, 432, 253, 4679, 1880, 253, 747, 5933, 310, 6296, 7938, 323, 1650, 271, 4722, 3368, 651, 320, 247, 5301, 342, 2781, 290, 209, 2587, 347, 275, 3762, 253, 4081, 1332, 310, 2104, 281, 29623, 281, 247, 17429, 1127, 28763, 253, 1072, 347, 2781, 290, 209, 2587, 533, 7938, 50276, 44295, 3062, 253, 19843, 273, 253, 2929, 476, 320, 5520, 1386, 12643, 281, 13620, 310, 2581, 21643, 285, 760, 1919, 253, 1273, 4361, 891, 369, 2104, 281, 2096, 253, 14168, 275, 1386, 12035, 18086, 22666, 310, 4764, 1025, 407, 39116, 533, 275, 1386, 15567, 39116, 310, 908, 281, 4764, 907, 253, 10921, 1159, 50275, 18, 465, 312, 8349, 74, 23087, 8678, 42599, 266, 8913, 47941, 285, 480, 2116, 12865, 1063, 375, 5919, 3045, 14493, 323, 819, 1983, 34716, 35221, 4715, 432, 32367, 5213, 8059, 327, 5145, 4715, 268, 1686, 83, 43425, 50276, 19, 480, 404, 35803, 312, 376, 819, 
1225, 386, 9500, 5801, 285, 270, 1479, 3358, 75, 262, 8913, 254, 39101, 771, 813, 658, 209, 2587, 970, 4869, 12177, 13418, 10061, 273, 253, 39951, 2284, 8059, 327, 13345, 9260, 1936, 5922, 642, 14805, 6247, 50276, 20, 1182, 466, 35292, 270, 5651, 277, 480, 285, 2663, 270, 1530, 437, 285, 271, 527, 465, 372, 90, 253, 8063, 273, 4869, 19349, 15579, 323, 26230, 18745, 4870, 26332, 1796, 13122, 327, 1491, 3762, 46475, 4072, 19213, 16134, 50276, 21, 7263, 18754, 2080, 1370, 6746, 12436, 8276, 437, 285, 268, 1715, 8805, 1006, 800, 48960, 45738, 432, 8310, 549, 32693, 638, 3845, 549, 32693, 11395, 28166, 18663, 4765, 50276, 22, 1442, 79, 1161, 77, 15681, 1151, 463, 90, 20978, 460, 285, 12580, 1715, 490, 1257, 293, 18107, 2105, 4715, 3676, 13737, 8654, 1453, 3066, 3646, 13757, 5213, 8059, 327, 5145, 4715, 268, 1686, 83, 4022, 50276, 9820, 2490, 187, 4118, 18435, 27, 2520, 2929, 10262, 247, 747, 2014, 6287, 209, 2587, 5933, 326, 32547, 253, 6867, 3646, 250, 1034, 13757, 6287, 275, 209, 2587, 11333, 1293, 18501, 272, 253, 7200, 273, 253, 6311, 10921, 1159, 436, 310, 6786, 949, 253, 897, 273, 19191, 27935, 273, 253, 12177, 1159, 253, 4081, 5933, 310, 8058, 281, 29623, 281, 247, 17429, 2900, 342, 247, 1442, 262, 7816, 12215, 4679, 327, 690, 3237, 275, 278, 10441, 16856, 921, 326, 253, 4081, 5933, 476, 562, 32231, 5368, 5482, 253, 30628, 512, 5194, 326, 253, 2929, 310, 973, 15720, 253, 5933, 310, 10481, 747, 285, 253, 4679, 403, 18511, 627, 403, 690, 7350, 326, 5661, 7103, 310, 3710, 281, 278, 10441, 16856, 23904, 5011, 8892, 534, 476, 4536, 320, 1512, 2969, 281, 3812, 2266, 11815 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 4477, 12661, 247, 4869, 12177, 2746, 323, 209, 2587, 10527, 1543, 921, 326, 253, 5933, 26414, 275, 6486, 673, 285, 326, 323, 4872, 23267, 326, 253, 13361, 2587, 1895, 310, 8746, 342, 2781, 290, 2587, 285, 326, 627, 310, 2266, 34962, 16774, 1543, 921, 326, 13361, 2587, 41731, 13015, 1375, 23037, 14387, 209, 2587, 11333, 2439, 2067, 2629, 49602, 253, 2266, 34962, 10012, 4620, 4460, 285, 21354, 271, 4722, 4602, 875, 13361, 2587, 285, 2781, 290, 2587, 50276, 783, 4477, 5276, 253, 806, 6486, 23632, 323, 209, 2587, 342, 14561, 10921, 3470, 50276, 783, 16774, 1543, 403, 12532, 285, 3157, 2220, 594, 66, 50276, 8826, 273, 253, 3916, 273, 38135, 1646, 281, 320, 2629, 3946, 285, 403, 908, 275, 2720, 789, 50276, 262, 310, 12744, 2139, 271, 13361, 2746, 281, 209, 2587, 943, 1347, 1805, 685, 643, 7274, 253, 4477, 452, 5322, 3762, 533, 275, 3946, 253, 9009, 5933, 3133, 1077, 2074, 281, 2720, 789, 50276, 262, 310, 12744, 849, 10126, 1534, 253, 16774, 1543, 403, 342, 760, 495, 12922, 285, 642, 7162, 11508, 247, 5955, 273, 7364, 403, 14999, 275, 253, 2022, 2133, 273, 253, 2929, 5474, 339, 431, 248, 2929, 7473, 4219, 209, 2587, 347, 247, 15579, 12846, 1025, 4869, 12177, 1895, 253, 4477, 921, 326, 436, 1895, 310, 8746, 281, 4869, 15579, 209, 2587, 285, 840, 17075, 684, 436, 1895, 347, 247, 26413, 652, 13757, 597, 12661, 271, 5933, 326, 10040, 684, 875, 2014, 5018, 273, 7103, 285, 3646, 7756, 275, 4499, 281, 954, 2781, 15579, 209, 2587, 3082, 326, 452, 281, 8415, 391, 77, 3237, 275, 271, 6703, 6287, 323, 436, 5933, 253, 4477, 2085, 247, 6486, 673, 14940, 12215, 762, 690, 31793, 2515, 4720, 597, 2085, 271, 16774, 7103, 327, 2629, 278, 10441, 16856, 4836, 17227, 326, 616, 5933, 41731, 13015, 5795, 209, 2587, 7274, 20544, 50276, 783, 2929, 3400, 253, 806, 1442, 262, 7816, 14940, 1783, 1293, 7384, 4872, 23267, 347, 2080, 347, 891, 476, 2028, 253, 13260, 273, 253, 2022, 10527, 1543, 1646, 3965, 5272, 436, 310, 247, 4891, 10527, 7680, 50276, 783, 34962, 875, 2781, 290, 209, 2587, 285, 253, 4081, 4869, 12177, 15895, 1537, 320, 4722, 281, 253, 209, 2587, 3114, 285, 812, 41509, 4722, 956, 484, 789, 50276, 783, 4477, 2085, 247, 1175, 5661, 7103, 326, 10949, 954, 4623, 1666, 25379, 285, 2722, 326, 616, 5933, 476, 1347, 973, 45190, 50276, 783, 4477, 2319, 253, 38135, 273, 616, 789, 285, 2085, 3782, 4569, 14023, 281, 1543, 275, 2720, 789, 387, 2709, 1127, 24088, 323, 616, 13361, 15895, 253, 4081, 5933, 285, 253, 2022, 10527, 906, 50275, 20881, 1255, 265, 50276, 783, 34962, 342, 2781, 290, 209, 2587, 310, 417, 3782, 4460, 285, 556, 644, 2540, 1078, 275, 643, 3634, 347, 4879, 407, 253, 4477, 50276, 783, 1543, 275, 2829, 337, 13414, 921, 1077, 1781, 3910, 285, 352, 310, 1892, 281, 7472, 604, 841, 403, 1534, 3910, 1677, 326, 760, 495, 3632, 12922, 497, 1408, 50276, 783, 5661, 7103, 310, 3710, 281, 278, 10441, 16856, 23904, 5011, 8892, 534, 476, 4536, 320, 1512, 2969, 281, 3812, 2266, 11815, 50276, 1189, 455, 253, 4028, 812, 320, 30909, 387, 2069, 285, 253, 2605, 273, 253, 2929, 812, 320, 5520, 323, 1650, 253, 10199, 30885, 247, 2257, 273, 673, 5955, 2720, 789, 1078, 14851, 253, 9021, 273, 436, 2929, 534, 310, 749, 29776, 891, 858, 417, 1089, 667, 5955, 273, 2442, 16055, 3486, 273, 436, 789, 5474, 33032, 2520, 2929, 29328, 247, 4460, 2014, 14075, 209, 2587, 5933, 281, 4796, 253, 15180, 7977, 273, 253, 2045, 209, 2587, 7792, 253, 4477, 5276, 326, 253, 2900, 273, 253, 4081, 7792, 
310, 6425, 281, 253, 4869, 15579, 209, 2587, 7792, 342, 247, 50276, 1282, 1285, 4764, 1025, 10921, 1159, 597, 671, 2085, 247, 14940, 12215, 273, 253, 4081, 5933, 253, 3368, 906, 3133, 12532, 50276, 296, 3755, 20556, 50275, 783, 2929, 310, 973, 3542, 50275, 783, 1895, 5421, 275, 436, 2929, 310, 1774, 285, 253, 4081, 1332, 310, 4460, 50276, 783, 4477, 2085, 10527, 23632, 323, 253, 4081, 7792, 50276, 20881, 1255, 265, 50275, 9088, 778, 878, 625, 6239, 2278, 327, 253, 2781, 290, 2587, 50276, 20261, 253, 13361, 2587, 7792, 812, 4796, 253, 15180, 10454, 352, 310, 417, 4755, 2139, 253, 4081, 13361, 2587, 7792, 2987, 323, 253, 3700, 4715, 8892, 50276, 783, 4477, 18212, 9713, 253, 7364, 285, 2442, 4016, 38058, 3486, 273, 616, 789, 5474, 339, 9852, 436, 2929, 253, 4477, 4081, 247, 747, 15895, 273, 209, 2587, 1754, 327, 4869, 12177, 534, 310, 6425, 281, 2781, 290, 672, 253, 10921, 1159, 310, 4872, 50276, 1615, 19732, 2977, 436, 15895, 253, 4477, 12661, 247, 747, 43245, 5919, 11786, 3169, 34560, 5933, 326, 1057, 417, 2430, 16161, 278, 12132, 275, 1016, 19502, 253, 4477, 2007, 5276, 247, 1327, 284, 40045, 33248, 2281, 273, 849, 3809, 253, 5933, 26414, 281, 247, 17429, 1127, 534, 310, 253, 806, 1327, 284, 40045, 3875, 14940, 906, 323, 4164, 83, 342, 14561, 10921, 4764, 1320, 4720, 253, 4477, 2589, 9470, 4679, 326, 921, 326, 253, 4081, 5933, 41731, 13015, 2067, 1375, 23037, 14387, 209, 2587, 11333, 50275, 45563, 50275, 783, 4477, 1750, 326, 289, 78, 374, 310, 253, 806, 1327, 284, 40045, 3875, 14940, 906, 323, 4164, 83, 342, 14561, 10921, 4764, 1320, 891, 2868, 436, 310, 2032, 347, 247, 3332, 2929, 337, 432, 17857, 1686, 43425, 19539, 247, 1327, 284, 40045, 3875, 2281, 323, 247, 11786, 3169, 1332, 342, 4872, 10921, 4764, 1320, 3738, 326, 9380, 4737, 1057, 417, 1646, 281, 452, 253, 9376, 253, 1318, 1159, 476, 320, 13613, 5998, 50274, 20881, 1255, 50275, 74, 1158, 253, 7680, 1840, 310, 1774, 281, 253, 209, 2587, 3114, 50276, 35529, 643, 9021, 273, 436, 2929, 778, 417, 320, 347, 1534, 347, 436, 581, 50275, 18, 253, 747, 15895, 273, 209, 2587, 436, 4869, 12177, 15895, 310, 1929, 374, 253, 4737, 3133, 281, 320, 2074, 281, 289, 78, 495, 275, 495, 50276, 19, 253, 5933, 253, 4477, 5393, 326, 253, 5933, 29566, 15180, 6733, 285, 310, 7032, 273, 10921, 27090, 2299, 305, 647, 881, 273, 11333, 326, 897, 4767, 2662, 23267, 275, 253, 7134, 12915, 577, 285, 44755, 2781, 290, 209, 2587, 11333, 608, 403, 2104, 281, 5115, 1110, 5373, 347, 973, 50276, 20, 253, 4679, 891, 751, 253, 4679, 327, 10921, 27090, 2299, 253, 2022, 4736, 273, 36636, 253, 747, 5933, 285, 3762, 310, 281, 4796, 13782, 7977, 533, 253, 4679, 2770, 327, 253, 3290, 273, 253, 6311, 3646, 50276, 250, 1034, 846, 14940, 697, 12744, 432, 253, 4679, 1880, 253, 747, 5933, 310, 6296, 7938, 323, 1650, 271, 4722, 3368, 651, 320, 247, 5301, 342, 2781, 290, 209, 2587, 347, 275, 3762, 253, 4081, 1332, 310, 2104, 281, 29623, 281, 247, 17429, 1127, 28763, 253, 1072, 347, 2781, 290, 209, 2587, 533, 7938, 50276, 44295, 3062, 253, 19843, 273, 253, 2929, 476, 320, 5520, 1386, 12643, 281, 13620, 310, 2581, 21643, 285, 760, 1919, 253, 1273, 4361, 891, 369, 2104, 281, 2096, 253, 14168, 275, 1386, 12035, 18086, 22666, 310, 4764, 1025, 407, 39116, 533, 275, 1386, 15567, 39116, 310, 908, 281, 4764, 907, 253, 10921, 1159, 50275, 18, 465, 312, 8349, 74, 23087, 8678, 42599, 266, 8913, 47941, 285, 480, 2116, 12865, 1063, 375, 5919, 3045, 14493, 323, 819, 1983, 34716, 35221, 4715, 432, 32367, 5213, 8059, 327, 5145, 4715, 268, 1686, 83, 43425, 50276, 19, 480, 404, 35803, 312, 376, 819, 
1225, 386, 9500, 5801, 285, 270, 1479, 3358, 75, 262, 8913, 254, 39101, 771, 813, 658, 209, 2587, 970, 4869, 12177, 13418, 10061, 273, 253, 39951, 2284, 8059, 327, 13345, 9260, 1936, 5922, 642, 14805, 6247, 50276, 20, 1182, 466, 35292, 270, 5651, 277, 480, 285, 2663, 270, 1530, 437, 285, 271, 527, 465, 372, 90, 253, 8063, 273, 4869, 19349, 15579, 323, 26230, 18745, 4870, 26332, 1796, 13122, 327, 1491, 3762, 46475, 4072, 19213, 16134, 50276, 21, 7263, 18754, 2080, 1370, 6746, 12436, 8276, 437, 285, 268, 1715, 8805, 1006, 800, 48960, 45738, 432, 8310, 549, 32693, 638, 3845, 549, 32693, 11395, 28166, 18663, 4765, 50276, 22, 1442, 79, 1161, 77, 15681, 1151, 463, 90, 20978, 460, 285, 12580, 1715, 490, 1257, 293, 18107, 2105, 4715, 3676, 13737, 8654, 1453, 3066, 3646, 13757, 5213, 8059, 327, 5145, 4715, 268, 1686, 83, 4022, 50276, 9820, 2490, 187, 4118, 18435, 27, 2520, 2929, 10262, 247, 747, 2014, 6287, 209, 2587, 5933, 326, 32547, 253, 6867, 3646, 250, 1034, 13757, 6287, 275, 209, 2587, 11333, 1293, 18501, 272, 253, 7200, 273, 253, 6311, 10921, 1159, 436, 310, 6786, 949, 253, 897, 273, 19191, 27935, 273, 253, 12177, 1159, 253, 4081, 5933, 310, 8058, 281, 29623, 281, 247, 17429, 2900, 342, 247, 1442, 262, 7816, 12215, 4679, 327, 690, 3237, 275, 278, 10441, 16856, 921, 326, 253, 4081, 5933, 476, 562, 32231, 5368, 5482, 253, 30628, 512, 5194, 326, 253, 2929, 310, 973, 15720, 253, 5933, 310, 10481, 747, 285, 253, 4679, 403, 18511, 627, 403, 690, 7350, 326, 5661, 7103, 310, 3710, 281, 278, 10441, 16856, 23904, 5011, 8892, 534, 476, 4536, 320, 1512, 2969, 281, 3812, 2266, 11815 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary the paper presents a method for classification which takes into account the semantic hierarchy of output labels rather than treating them as independent categories in a typical classification setup the loss penalizes the kldivergence between the models predicted label distribution and a onehot distribution placing all probability mass on the single groundtruth label for each example the proposed method instead constructs a target distribution which places probability mass not only on leaf category nodes but also on their neighbors in a known semantic hierarchy of labels then penalizes the kldivergence between a models predicted distribution and this target distribution this model is used for classification on imagenet1k and for zeroshot classification on imagenet21k where a model must predict superclasses seen during training for images of leaf categories not seen during training pros method is fairly straightforward modeling relationships between labels is an important problem cons missing references to key prior work in this space minimal comparison to prior work confusing experimental setup paper is difficult to read missing references this paper is far from the first to consider the use of a semantic hierarchy to improve classification systems see for example deng et al hedging your bets optimizing accuracyspecificity tradeoffs in large scale visual recognition cvpr 2012 deng et al largescale object classification using label relation graphs eccv 2014 best paper jiang et al exploiting feature and class relationships in video categorization with regularized deep neural networks tpami 2017 none of these are cited in the submission deng et al 2014 is particularly relevant as it considers not just isa relationships as in this submission but also mutual exclusion relationships between categories without citation discussion and comparison with some of these key pieces of prior work the current submission is incomplete comparison to prior work the only direct comparison to prior work in the paper is the comparison to devise on ilsvrc12 classification performance in table 3 however since devise was intended to be used for zeroshot learning and not traditional supervised classification this comparison seems unfair instead the authors should compare their method against devise and conse for zeroshot learning indeed in section 43 the authors construct a test set in a sic same manner defined in frome et al but do not actually compare against this prior work i suspect that the authors chose not to perform this comparison since unlike devise and conse their method cannot predict category labels not seen during training instead it is constrained to predicting a known supercategory when presented with an image of a novel leaf category as such the proposed method is not really zeroshot in the sense of devise and conse experimental setup from section 31 we adopt a subset of imagenet the ilsvrc12 dataset which gather sic 1k classes the 1000 category labels in ilsvrc12 are mutually exclusive leaf nodes when placed in the context of the wordnet hierarchy there are 820 internal nodes between these leaves and the wordnet root as a result for the method to make sense i assume that all models must be trained to output classification scores for all 1820 categories rather than the 1k leaf categories this should be made more explicit in the paper as it means that none of the performance 
metrics reported in the paper are comparable to other results on ilsvrc12 which only measure performance on the 1k leaf categories the experiments on zeroshot learning are also confusing rather than following the existing experimental protocol for evaluating zeroshot learning from frome et al 2013 and norouzi et al 2013 the authors evaluate zeroshot learning by plotting sghit vs sgspecificity while these are reasonable metrics they make it difficult to compare with prior work poor writing the paper is difficult to follow with confusing notation and many spelling and grammatical errors overall on the whole the paper addresses an important problem and presents a reasonable method however due to the omission of key references and incomplete comparison to prior work the paper is not suitable for publication in its current formdocsepthis paper proposes a new soft negative loglikelihood loss formulation for multiclass classification problems the new loss is built upon the taxonomy graph of labels which is provided as external knowledge and this loss provides better semantic generalization ability compared to a regular nway classifier and yields more accurate and meaningful superclass predictions this paper is wellwritten the main ideas and claims are clearly expressed the main benefits of the new loss are caused by the extra information contained by the taxonomy of labels and this idea is wellknown and popular in the literature based on this reason i think the main contribution of this paper is the discussion on two novel learning settings which related to the superclasses however the formulation of the new soft nll loss and the sg measurement involves lots of concepts designed based on experiences so its hard to say whether these are the optimal choices so i suggest the authors discuss more on these designs another thing i concern about is the source of label taxonomy how to efficiently generate the taxonomy what if the taxonomy is not perfect and contains noises will these significantly affect the models performance i think its better to take these problems into consideration in conclusion i think this is an interesting paper but can still be improveddocsepfirst of all the paper cannot be accepted because it violates the double blind submission policy by including an acknowledgments section nonetheless i will give some brief comments the paper proposes a probabilistic hierarchical approach to perform zeroshot learning instead of directly optimizing the standard crossentropy loss the paper considers some soft probability scores that consider some class graph taxonomy the experimental section of the paper is strong enough although more baselines could have been tested the paper only compares the usual cross entropy loss with their proposed softclassification framework nonetheless different architectures of neural networks are tested on imagenet and validate the fact that the soft probability strategy improves performance on the zeroshot learning task on the other hand the theoretical aspect is weak the proposed method seems to be a straightforward extension of frome et al nips 2013 the main contribution is that soft probability scores are used to perform classification instead of using only class membership information some weighting strategy is proposed in section 22 but the proposed steps seem very ad hoc with no theoretical justification the first equation on page 8 has the same problem where some random definition is provided ### Summary:
the paper proposes to take into account the label structure for classification tasks instead of a flat nway softmax this also leads to a zeroshot setting to consider novel classes reviewers point to a lack of reference to prior work and comparisons authors have tried to justify their choices but the overall sentiment is that it lacks novelty with respect to previous approaches all reviewers recommend to reject and so do i
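The hierarchy-aware objective discussed in this row can likewise be illustrated with a short sketch. This is a minimal PyTorch example, assuming a toy taxonomy given as a parent map and a classifier that scores leaf classes and superclasses jointly; the ancestor weight and its decay are illustrative assumptions, not the weighting actually proposed in the paper.

```python
import torch
import torch.nn.functional as F

# Toy soft-target construction over a label taxonomy: most probability mass on
# the ground-truth leaf, the remainder spread over its ancestors, then a KL
# loss between the model's predicted distribution and this soft target.
parent = {0: 5, 1: 5, 2: 6, 3: 6, 4: 6, 5: 7, 6: 7, 7: None}   # 5 leaves, 2 internal nodes, 1 root
num_classes = 8                                                  # leaves and superclasses share one output layer

def soft_target(leaf, ancestor_weight=0.3, decay=0.5):
    t = torch.zeros(num_classes)
    t[leaf] = 1.0
    node, w = parent[leaf], ancestor_weight
    while node is not None:
        t[node] = w
        node, w = parent[node], w * decay    # less mass the further up the tree
    return t / t.sum()

def soft_nll_loss(logits, leaf_labels):
    log_p = F.log_softmax(logits, dim=-1)
    targets = torch.stack([soft_target(int(y)) for y in leaf_labels])
    return F.kl_div(log_p, targets, reduction="batchmean")       # KL(target || prediction)

logits = torch.randn(4, num_classes, requires_grad=True)
labels = torch.tensor([0, 2, 3, 1])                              # leaf labels only
loss = soft_nll_loss(logits, labels)
loss.backward()
```

Because the superclasses receive probability mass during training, the same output layer can be asked for a superclass prediction on images of unseen leaf categories, which is the zero-shot setting the summary refers to.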
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 8774, 253, 2929, 10262, 247, 1332, 323, 9162, 534, 3936, 715, 2395, 253, 24705, 19868, 273, 3453, 13301, 2581, 685, 12767, 731, 347, 3907, 9050, 275, 247, 6867, 9162, 9978, 253, 2957, 29697, 4219, 253, 465, 392, 2373, 9515, 875, 253, 3210, 8131, 5203, 3268, 285, 247, 581, 12022, 3268, 15606, 512, 5912, 2280, 327, 253, 2014, 3216, 33024, 5203, 323, 1016, 1650, 253, 4081, 1332, 3185, 21031, 247, 2303, 3268, 534, 5053, 5912, 2280, 417, 760, 327, 10617, 7140, 7632, 533, 671, 327, 616, 15833, 275, 247, 1929, 24705, 19868, 273, 13301, 840, 29697, 4219, 253, 465, 392, 2373, 9515, 875, 247, 3210, 8131, 3268, 285, 436, 2303, 3268, 436, 1566, 310, 908, 323, 9162, 327, 4440, 257, 292, 18, 76, 285, 323, 1182, 254, 6934, 302, 9162, 327, 4440, 257, 292, 1797, 76, 835, 247, 1566, 1364, 3283, 2221, 19770, 2326, 1309, 3733, 323, 3888, 273, 10617, 9050, 417, 2326, 1309, 3733, 50276, 856, 84, 50276, 9349, 310, 9648, 15246, 50276, 7645, 272, 7688, 875, 13301, 310, 271, 1774, 1895, 50276, 5040, 50276, 33722, 10414, 281, 2234, 2720, 789, 275, 436, 2317, 50276, 8402, 5301, 281, 2720, 789, 50276, 8259, 5302, 5661, 9978, 50276, 20790, 310, 2834, 281, 1239, 50276, 33722, 10414, 436, 2929, 310, 2080, 432, 253, 806, 281, 1908, 253, 897, 273, 247, 24705, 19868, 281, 3157, 9162, 2718, 923, 323, 1650, 50276, 69, 1205, 1162, 355, 21867, 3390, 634, 45780, 39793, 3933, 317, 656, 29765, 414, 5454, 14273, 275, 1781, 4311, 5304, 8981, 30105, 1087, 4050, 50276, 69, 1205, 1162, 355, 1236, 2510, 25912, 1789, 9162, 970, 5203, 5886, 14580, 23746, 87, 4059, 1682, 2929, 50276, 36492, 1162, 355, 38883, 4735, 285, 966, 7688, 275, 3492, 13213, 1320, 342, 3963, 1025, 3676, 11454, 6928, 246, 81, 7588, 4240, 50276, 15422, 273, 841, 403, 11106, 275, 253, 19529, 32087, 1162, 355, 4059, 310, 3782, 4623, 347, 352, 19401, 417, 816, 310, 66, 7688, 347, 275, 436, 19529, 533, 671, 15577, 14978, 7688, 875, 9050, 1293, 25577, 5955, 285, 5301, 342, 690, 273, 841, 2234, 7437, 273, 2720, 789, 253, 1655, 19529, 310, 18464, 50276, 47109, 281, 2720, 789, 253, 760, 1480, 5301, 281, 2720, 789, 275, 253, 2929, 310, 253, 5301, 281, 45018, 327, 38934, 87, 3373, 805, 9162, 3045, 275, 2829, 495, 2299, 1580, 45018, 369, 6034, 281, 320, 908, 323, 1182, 254, 6934, 302, 4715, 285, 417, 5899, 22296, 9162, 436, 5301, 3133, 16593, 50276, 34235, 253, 4477, 943, 7277, 616, 1332, 1411, 45018, 285, 35584, 323, 1182, 254, 6934, 302, 4715, 6296, 275, 2593, 7652, 253, 4477, 3989, 247, 1071, 873, 275, 247, 256, 280, 1072, 5133, 2931, 275, 432, 70, 1162, 355, 533, 513, 417, 2686, 7277, 1411, 436, 2720, 789, 50276, 74, 9101, 326, 253, 4477, 9703, 417, 281, 1347, 436, 5301, 1580, 12401, 45018, 285, 35584, 616, 1332, 2550, 3283, 7140, 13301, 417, 2326, 1309, 3733, 3185, 352, 310, 20793, 281, 21565, 247, 1929, 2221, 14267, 672, 3559, 342, 271, 2460, 273, 247, 4460, 10617, 7140, 347, 824, 253, 4081, 1332, 310, 417, 1663, 1182, 254, 6934, 302, 275, 253, 3282, 273, 45018, 285, 35584, 50276, 49363, 9978, 432, 2593, 4562, 359, 5283, 247, 8578, 273, 4440, 257, 292, 253, 38934, 87, 3373, 805, 10895, 534, 9580, 256, 280, 337, 76, 5971, 50276, 783, 9098, 7140, 13301, 275, 38934, 87, 3373, 805, 403, 25834, 11855, 10617, 7632, 672, 4845, 275, 253, 3634, 273, 253, 3159, 3024, 19868, 627, 403, 50060, 4812, 7632, 875, 841, 6505, 285, 253, 3159, 3024, 5230, 347, 247, 906, 323, 253, 1332, 281, 1056, 3282, 891, 5467, 326, 512, 3210, 1364, 320, 10166, 281, 
3453, 9162, 7363, 323, 512, 1283, 938, 9050, 2581, 685, 253, 337, 76, 10617, 9050, 436, 943, 320, 1160, 625, 6843, 275, 253, 2929, 347, 352, 2097, 326, 5293, 273, 253, 3045, 17082, 2361, 275, 253, 2929, 403, 10870, 281, 643, 1543, 327, 38934, 87, 3373, 805, 534, 760, 2557, 3045, 327, 253, 337, 76, 10617, 9050, 50276, 783, 4679, 327, 1182, 254, 6934, 302, 4715, 403, 671, 21643, 2581, 685, 1563, 253, 5368, 5661, 7241, 323, 16344, 1182, 254, 6934, 302, 4715, 432, 432, 70, 1162, 355, 4072, 285, 4543, 276, 9877, 1162, 355, 4072, 253, 4477, 7472, 1182, 254, 6934, 302, 4715, 407, 38542, 256, 18068, 262, 4632, 48237, 6160, 414, 1223, 841, 403, 5272, 17082, 597, 1056, 352, 2834, 281, 7277, 342, 2720, 789, 50276, 31943, 4028, 253, 2929, 310, 2834, 281, 956, 342, 21643, 14951, 285, 1142, 33797, 285, 47412, 474, 6332, 50276, 1189, 455, 327, 253, 2644, 253, 2929, 12453, 271, 1774, 1895, 285, 10262, 247, 5272, 1332, 2299, 1955, 281, 253, 33860, 273, 2234, 10414, 285, 18464, 5301, 281, 2720, 789, 253, 2929, 310, 417, 7470, 323, 9311, 275, 697, 1655, 830, 7152, 33032, 2520, 2929, 29328, 247, 747, 2602, 4016, 2412, 7513, 10202, 2957, 15895, 323, 23559, 14407, 9162, 3237, 253, 747, 2957, 310, 4270, 2220, 253, 2891, 13646, 4216, 273, 13301, 534, 310, 2530, 347, 6024, 3640, 285, 436, 2957, 3400, 1805, 24705, 26647, 3745, 2429, 281, 247, 3963, 295, 1106, 30410, 285, 11026, 625, 7899, 285, 14282, 2221, 2437, 13650, 50276, 2520, 2929, 310, 973, 15720, 253, 2022, 5697, 285, 3916, 403, 4518, 4469, 253, 2022, 5373, 273, 253, 747, 2957, 403, 4269, 407, 253, 4465, 1491, 6221, 407, 253, 2891, 13646, 273, 13301, 285, 436, 2934, 310, 973, 4304, 285, 4633, 275, 253, 6239, 1754, 327, 436, 1921, 891, 1158, 253, 2022, 7680, 273, 436, 2929, 310, 253, 5955, 327, 767, 4460, 4715, 7533, 534, 2905, 281, 253, 2221, 19770, 2299, 253, 15895, 273, 253, 747, 2602, 295, 620, 2957, 285, 253, 48237, 6814, 8687, 8783, 273, 12342, 4158, 1754, 327, 8450, 594, 697, 1892, 281, 1333, 1880, 841, 403, 253, 8654, 10165, 594, 891, 1804, 253, 4477, 2319, 625, 327, 841, 11809, 1529, 2181, 891, 4468, 670, 310, 253, 2603, 273, 5203, 2891, 13646, 849, 281, 14556, 6635, 253, 2891, 13646, 752, 604, 253, 2891, 13646, 310, 417, 3962, 285, 4428, 33737, 588, 841, 3012, 2818, 253, 3210, 3045, 891, 1158, 697, 1805, 281, 1379, 841, 3237, 715, 8180, 50276, 249, 6452, 891, 1158, 436, 310, 271, 4722, 2929, 533, 476, 1335, 320, 5520, 7152, 33032, 7053, 273, 512, 253, 2929, 2550, 320, 7607, 984, 352, 28096, 253, 4021, 9645, 19529, 3646, 407, 1690, 271, 9555, 14908, 2593, 50276, 4160, 14153, 891, 588, 1918, 690, 4864, 5701, 50275, 783, 2929, 29328, 247, 37851, 24498, 2746, 281, 1347, 1182, 254, 6934, 302, 4715, 3185, 273, 3587, 39793, 253, 2629, 2831, 290, 10144, 2957, 253, 2929, 19401, 690, 2602, 5912, 7363, 326, 1908, 690, 966, 4216, 2891, 13646, 50275, 783, 5661, 2593, 273, 253, 2929, 310, 2266, 2217, 3738, 625, 1666, 25379, 812, 452, 644, 5762, 253, 2929, 760, 26662, 253, 7312, 2831, 15579, 2957, 342, 616, 4081, 2602, 42070, 7792, 50276, 4160, 14153, 1027, 35615, 273, 11454, 6928, 403, 5762, 327, 4440, 257, 292, 285, 17813, 253, 958, 326, 253, 2602, 5912, 5700, 19132, 3045, 327, 253, 1182, 254, 6934, 302, 4715, 4836, 50274, 251, 253, 643, 1133, 253, 10527, 4809, 310, 5075, 253, 4081, 1332, 3133, 281, 320, 247, 15246, 6880, 273, 432, 70, 1162, 355, 295, 2824, 4072, 253, 2022, 7680, 310, 326, 2602, 5912, 7363, 403, 908, 281, 1347, 9162, 3185, 273, 970, 760, 966, 14199, 1491, 50276, 8826, 42428, 5700, 310, 4081, 275, 2593, 3307, 533, 253, 4081, 5018, 1646, 1077, 
519, 26901, 342, 642, 10527, 22861, 253, 806, 5150, 327, 3239, 854, 556, 253, 1072, 1895, 835, 690, 3632, 5426, 310, 2530, 2490, 187, 4118, 18435, 27, 783, 2929, 29328, 281, 1379, 715, 756, 2084, 253, 5203, 2605, 323, 9162, 8892, 3185, 273, 247, 6507, 295, 1106, 2602, 4090, 436, 671, 1421, 281, 247, 1182, 254, 6934, 302, 4758, 281, 1908, 4460, 5971, 30628, 1127, 281, 247, 3480, 273, 3806, 281, 2720, 789, 285, 14023, 4477, 452, 3597, 281, 15249, 616, 10165, 533, 253, 4583, 21942, 310, 326, 352, 19756, 38135, 342, 1675, 281, 2045, 7274, 512, 30628, 5583, 281, 12009, 285, 594, 513, 891 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 8774, 253, 2929, 10262, 247, 1332, 323, 9162, 534, 3936, 715, 2395, 253, 24705, 19868, 273, 3453, 13301, 2581, 685, 12767, 731, 347, 3907, 9050, 275, 247, 6867, 9162, 9978, 253, 2957, 29697, 4219, 253, 465, 392, 2373, 9515, 875, 253, 3210, 8131, 5203, 3268, 285, 247, 581, 12022, 3268, 15606, 512, 5912, 2280, 327, 253, 2014, 3216, 33024, 5203, 323, 1016, 1650, 253, 4081, 1332, 3185, 21031, 247, 2303, 3268, 534, 5053, 5912, 2280, 417, 760, 327, 10617, 7140, 7632, 533, 671, 327, 616, 15833, 275, 247, 1929, 24705, 19868, 273, 13301, 840, 29697, 4219, 253, 465, 392, 2373, 9515, 875, 247, 3210, 8131, 3268, 285, 436, 2303, 3268, 436, 1566, 310, 908, 323, 9162, 327, 4440, 257, 292, 18, 76, 285, 323, 1182, 254, 6934, 302, 9162, 327, 4440, 257, 292, 1797, 76, 835, 247, 1566, 1364, 3283, 2221, 19770, 2326, 1309, 3733, 323, 3888, 273, 10617, 9050, 417, 2326, 1309, 3733, 50276, 856, 84, 50276, 9349, 310, 9648, 15246, 50276, 7645, 272, 7688, 875, 13301, 310, 271, 1774, 1895, 50276, 5040, 50276, 33722, 10414, 281, 2234, 2720, 789, 275, 436, 2317, 50276, 8402, 5301, 281, 2720, 789, 50276, 8259, 5302, 5661, 9978, 50276, 20790, 310, 2834, 281, 1239, 50276, 33722, 10414, 436, 2929, 310, 2080, 432, 253, 806, 281, 1908, 253, 897, 273, 247, 24705, 19868, 281, 3157, 9162, 2718, 923, 323, 1650, 50276, 69, 1205, 1162, 355, 21867, 3390, 634, 45780, 39793, 3933, 317, 656, 29765, 414, 5454, 14273, 275, 1781, 4311, 5304, 8981, 30105, 1087, 4050, 50276, 69, 1205, 1162, 355, 1236, 2510, 25912, 1789, 9162, 970, 5203, 5886, 14580, 23746, 87, 4059, 1682, 2929, 50276, 36492, 1162, 355, 38883, 4735, 285, 966, 7688, 275, 3492, 13213, 1320, 342, 3963, 1025, 3676, 11454, 6928, 246, 81, 7588, 4240, 50276, 15422, 273, 841, 403, 11106, 275, 253, 19529, 32087, 1162, 355, 4059, 310, 3782, 4623, 347, 352, 19401, 417, 816, 310, 66, 7688, 347, 275, 436, 19529, 533, 671, 15577, 14978, 7688, 875, 9050, 1293, 25577, 5955, 285, 5301, 342, 690, 273, 841, 2234, 7437, 273, 2720, 789, 253, 1655, 19529, 310, 18464, 50276, 47109, 281, 2720, 789, 253, 760, 1480, 5301, 281, 2720, 789, 275, 253, 2929, 310, 253, 5301, 281, 45018, 327, 38934, 87, 3373, 805, 9162, 3045, 275, 2829, 495, 2299, 1580, 45018, 369, 6034, 281, 320, 908, 323, 1182, 254, 6934, 302, 4715, 285, 417, 5899, 22296, 9162, 436, 5301, 3133, 16593, 50276, 34235, 253, 4477, 943, 7277, 616, 1332, 1411, 45018, 285, 35584, 323, 1182, 254, 6934, 302, 4715, 6296, 275, 2593, 7652, 253, 4477, 3989, 247, 1071, 873, 275, 247, 256, 280, 1072, 5133, 2931, 275, 432, 70, 1162, 355, 533, 513, 417, 2686, 7277, 1411, 436, 2720, 789, 50276, 74, 9101, 326, 253, 4477, 9703, 417, 281, 1347, 436, 5301, 1580, 12401, 45018, 285, 35584, 616, 1332, 2550, 3283, 7140, 13301, 417, 2326, 1309, 3733, 3185, 352, 310, 20793, 281, 21565, 247, 1929, 2221, 14267, 672, 3559, 342, 271, 2460, 273, 247, 4460, 10617, 7140, 347, 824, 253, 4081, 1332, 310, 417, 1663, 1182, 254, 6934, 302, 275, 253, 3282, 273, 45018, 285, 35584, 50276, 49363, 9978, 432, 2593, 4562, 359, 5283, 247, 8578, 273, 4440, 257, 292, 253, 38934, 87, 3373, 805, 10895, 534, 9580, 256, 280, 337, 76, 5971, 50276, 783, 9098, 7140, 13301, 275, 38934, 87, 3373, 805, 403, 25834, 11855, 10617, 7632, 672, 4845, 275, 253, 3634, 273, 253, 3159, 3024, 19868, 627, 403, 50060, 4812, 7632, 875, 841, 6505, 285, 253, 3159, 3024, 5230, 347, 247, 906, 323, 253, 1332, 281, 1056, 3282, 891, 5467, 326, 512, 3210, 1364, 320, 10166, 281, 
3453, 9162, 7363, 323, 512, 1283, 938, 9050, 2581, 685, 253, 337, 76, 10617, 9050, 436, 943, 320, 1160, 625, 6843, 275, 253, 2929, 347, 352, 2097, 326, 5293, 273, 253, 3045, 17082, 2361, 275, 253, 2929, 403, 10870, 281, 643, 1543, 327, 38934, 87, 3373, 805, 534, 760, 2557, 3045, 327, 253, 337, 76, 10617, 9050, 50276, 783, 4679, 327, 1182, 254, 6934, 302, 4715, 403, 671, 21643, 2581, 685, 1563, 253, 5368, 5661, 7241, 323, 16344, 1182, 254, 6934, 302, 4715, 432, 432, 70, 1162, 355, 4072, 285, 4543, 276, 9877, 1162, 355, 4072, 253, 4477, 7472, 1182, 254, 6934, 302, 4715, 407, 38542, 256, 18068, 262, 4632, 48237, 6160, 414, 1223, 841, 403, 5272, 17082, 597, 1056, 352, 2834, 281, 7277, 342, 2720, 789, 50276, 31943, 4028, 253, 2929, 310, 2834, 281, 956, 342, 21643, 14951, 285, 1142, 33797, 285, 47412, 474, 6332, 50276, 1189, 455, 327, 253, 2644, 253, 2929, 12453, 271, 1774, 1895, 285, 10262, 247, 5272, 1332, 2299, 1955, 281, 253, 33860, 273, 2234, 10414, 285, 18464, 5301, 281, 2720, 789, 253, 2929, 310, 417, 7470, 323, 9311, 275, 697, 1655, 830, 7152, 33032, 2520, 2929, 29328, 247, 747, 2602, 4016, 2412, 7513, 10202, 2957, 15895, 323, 23559, 14407, 9162, 3237, 253, 747, 2957, 310, 4270, 2220, 253, 2891, 13646, 4216, 273, 13301, 534, 310, 2530, 347, 6024, 3640, 285, 436, 2957, 3400, 1805, 24705, 26647, 3745, 2429, 281, 247, 3963, 295, 1106, 30410, 285, 11026, 625, 7899, 285, 14282, 2221, 2437, 13650, 50276, 2520, 2929, 310, 973, 15720, 253, 2022, 5697, 285, 3916, 403, 4518, 4469, 253, 2022, 5373, 273, 253, 747, 2957, 403, 4269, 407, 253, 4465, 1491, 6221, 407, 253, 2891, 13646, 273, 13301, 285, 436, 2934, 310, 973, 4304, 285, 4633, 275, 253, 6239, 1754, 327, 436, 1921, 891, 1158, 253, 2022, 7680, 273, 436, 2929, 310, 253, 5955, 327, 767, 4460, 4715, 7533, 534, 2905, 281, 253, 2221, 19770, 2299, 253, 15895, 273, 253, 747, 2602, 295, 620, 2957, 285, 253, 48237, 6814, 8687, 8783, 273, 12342, 4158, 1754, 327, 8450, 594, 697, 1892, 281, 1333, 1880, 841, 403, 253, 8654, 10165, 594, 891, 1804, 253, 4477, 2319, 625, 327, 841, 11809, 1529, 2181, 891, 4468, 670, 310, 253, 2603, 273, 5203, 2891, 13646, 849, 281, 14556, 6635, 253, 2891, 13646, 752, 604, 253, 2891, 13646, 310, 417, 3962, 285, 4428, 33737, 588, 841, 3012, 2818, 253, 3210, 3045, 891, 1158, 697, 1805, 281, 1379, 841, 3237, 715, 8180, 50276, 249, 6452, 891, 1158, 436, 310, 271, 4722, 2929, 533, 476, 1335, 320, 5520, 7152, 33032, 7053, 273, 512, 253, 2929, 2550, 320, 7607, 984, 352, 28096, 253, 4021, 9645, 19529, 3646, 407, 1690, 271, 9555, 14908, 2593, 50276, 4160, 14153, 891, 588, 1918, 690, 4864, 5701, 50275, 783, 2929, 29328, 247, 37851, 24498, 2746, 281, 1347, 1182, 254, 6934, 302, 4715, 3185, 273, 3587, 39793, 253, 2629, 2831, 290, 10144, 2957, 253, 2929, 19401, 690, 2602, 5912, 7363, 326, 1908, 690, 966, 4216, 2891, 13646, 50275, 783, 5661, 2593, 273, 253, 2929, 310, 2266, 2217, 3738, 625, 1666, 25379, 812, 452, 644, 5762, 253, 2929, 760, 26662, 253, 7312, 2831, 15579, 2957, 342, 616, 4081, 2602, 42070, 7792, 50276, 4160, 14153, 1027, 35615, 273, 11454, 6928, 403, 5762, 327, 4440, 257, 292, 285, 17813, 253, 958, 326, 253, 2602, 5912, 5700, 19132, 3045, 327, 253, 1182, 254, 6934, 302, 4715, 4836, 50274, 251, 253, 643, 1133, 253, 10527, 4809, 310, 5075, 253, 4081, 1332, 3133, 281, 320, 247, 15246, 6880, 273, 432, 70, 1162, 355, 295, 2824, 4072, 253, 2022, 7680, 310, 326, 2602, 5912, 7363, 403, 908, 281, 1347, 9162, 3185, 273, 970, 760, 966, 14199, 1491, 50276, 8826, 42428, 5700, 310, 4081, 275, 2593, 3307, 533, 253, 4081, 5018, 1646, 1077, 
519, 26901, 342, 642, 10527, 22861, 253, 806, 5150, 327, 3239, 854, 556, 253, 1072, 1895, 835, 690, 3632, 5426, 310, 2530, 2490, 187, 4118, 18435, 27, 783, 2929, 29328, 281, 1379, 715, 756, 2084, 253, 5203, 2605, 323, 9162, 8892, 3185, 273, 247, 6507, 295, 1106, 2602, 4090, 436, 671, 1421, 281, 247, 1182, 254, 6934, 302, 4758, 281, 1908, 4460, 5971, 30628, 1127, 281, 247, 3480, 273, 3806, 281, 2720, 789, 285, 14023, 4477, 452, 3597, 281, 15249, 616, 10165, 533, 253, 4583, 21942, 310, 326, 352, 19756, 38135, 342, 1675, 281, 2045, 7274, 512, 30628, 5583, 281, 12009, 285, 594, 513, 891 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper identifies label errors in test sets of popular vision text and audio datasets using confident learning the detected label errors are validated by crowdsourced workers from amazon mturk it is important to ensure all the test labels in benchmark datasets are correct the core finding in this paper challenges fundamentally the reported benchmark results in the literature so far the finding that higher capacity models generalize well on the original incorrect test labels but generalize worse on the corrected test labels than simpler counterparts is interesting 1 it is interesting to find and validate the label errors in test sets of popular benchmarks the opensource corrected test labels would be impactful 2 correcting all the test labels by brute force is expensive thus implementing preselection algorithmically as employed in this paper is important besides it is worthwhile to deal with realworld humanlevel label noise rather than synthetic label noise 3 the paper corrects test labels for many popular benchmark datasets that are heavily used in the literature 1 only a subset of the label errors had been corrected by humans it is unclear how many label errors are missed a complete set of corrected labels should include all label errors 2 the author only validates the label errors guessed by confident learning making the finding on model capacity not convincing enough this is because the label errors guessed by confident learning may not be independent it is unclear whether we can get the same finding if we have a subset of corrected labels independently sampled from the whole set with wrong annotations thus the currently used corrected labels in fig 3b may not represent the property of the other undetected label errors an experiment to show the difference between the real label errors and the current subset may be necessary docsepthis paper for the first time investigates the pervasive existence and important consequences of testdata errors in 10 popular ml datasets including imagenet mnist cifar10 etc the paper uses confident learning techniques to propose testdata error candidates and then asked mechanical turkers to confirm or verify the data errors the contribution of testdata error correction is very important to the mlvisionnlp fields as these datasets are so popularly used this paper also produces a significantly important observation that some simpler networks resnet18 sometimes work better than complicated ones resnet50 over data with error corrections which reveals an important fact that the bigcapacity models may fit too well to the training split even to the data errors there overall this is a very strong submission and very impactful to the field that i would even consider as a strong candidate competing for the best paper award of this track 1 this paper for the first time studied the pervasive existences and important consequences of testdata errors in 10 very popular datasets including imagenet cifar10 mnist etc 2 the paper finally produced dataerror corrections in these ten important datasets which could benefit the mlcvnlp communities very much 3 this paper uses the errordata corrections to reveal some important consequences including a bigcapacity networks may fit too much to the training dataerrors b some simpler network eg resnet18 may work better than resnet50 over the corrected errordata all of these conclusions are super important to the field 4 
this paper is very wellwritten easytoread readyforpublication 5 the authors also provided the codebase for reproducing the results a website for data browsing detailed statisticsdata annotation process etc i dont think this paper has major weaknesses below are some small suggestions 1 for quickdraw and amazonreviews the authors can discuss how do you plan to finish them or any other ways to scale up the verification 2 the information in the current sec 5 is too dense to make it easier to read i suggest the authors break the materials into some subsections or paragraphs 3 is it planned to also fix the annotation errors in the training split is the same cl framework scalable to fix errors in the training set docsepthis is an important and interesting work concerning label errors in commonly used benchmarks contributions analysis of label errors in test sets of 10 commonlyused datasets release of cleaned tests for the data sets analysis of implications of label errors in benchmarking neural models the problem of label errors in test sets has not been given enough attention so far and the authors are filling an important gap here they use confident learning to find errors in test sets in paper it is shown that such errors can have important implications when models are compared the main weakness of the paper is that the dark matter of hardtodetect label errors is not considered and discussed it might be the case that such errors have different impact for simpler and complicated models i think the authors should run their procedure for the manual identification of label errors on random samples as an additional step to compare the statistics with the errors found with the help of confident learning minor issue the label flying saucer in an example given in figure 1 is quite arbitrary id say i think it merit some discussion problems like this another issue is that the variance of error rate across the 10 test sets is quite large 10 for quick draw vs 2 for 3 data sets this should be discussed ### Summary:
this dataset concerns label errors in test sets for ml benchmarks the reviewers found it welldocumented well motivated and overall a compelling contribution
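The reviews in this record describe the preselection step as confident learning: out-of-sample predicted probabilities are used to flag test examples whose given label looks implausible, and only those candidates are sent to crowdworkers for verification. The sketch below is a minimal, self-contained illustration of that flagging rule under assumed inputs (`pred_probs`, `labels`); it is not the authors' actual pipeline and not the cleanlab library API.

```python
# Simplified confident-learning-style flagging of candidate label errors.
# Assumptions: `pred_probs` is an (n_examples, n_classes) array of out-of-sample
# predicted probabilities and `labels` is an int array of given (possibly noisy)
# test labels. Illustrative sketch only, not the paper's exact procedure.
import numpy as np

def flag_candidate_label_errors(pred_probs: np.ndarray, labels: np.ndarray) -> np.ndarray:
    n, k = pred_probs.shape
    # Per-class confidence threshold: mean predicted probability of class c
    # among examples whose given label is c.
    thresholds = np.array([
        pred_probs[labels == c, c].mean() if np.any(labels == c) else 1.0
        for c in range(k)
    ])
    # A class "confidently" claims an example if its probability clears that
    # class's threshold; take the best such class for each example.
    confident = pred_probs >= thresholds
    best = np.where(confident, pred_probs, -np.inf).argmax(axis=1)
    claimed = confident.any(axis=1)
    # Candidate error: some class confidently claims the example but it is not
    # the given label (an off-diagonal entry of the confident joint).
    return claimed & (best != labels)

# The returned mask can then be ranked, e.g. by the given label's predicted
# probability (lowest first), to choose which candidates go to human review.
```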
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 22649, 5203, 6332, 275, 1071, 5239, 273, 4633, 8113, 2505, 285, 9797, 15302, 970, 13224, 4715, 253, 5189, 5203, 6332, 403, 17618, 407, 24597, 47549, 5820, 432, 7001, 251, 26301, 321, 76, 352, 310, 1774, 281, 5416, 512, 253, 1071, 13301, 275, 22791, 15302, 403, 3451, 253, 5161, 4560, 275, 436, 2929, 7881, 26401, 253, 2361, 22791, 1543, 275, 253, 6239, 594, 2080, 253, 4560, 326, 2169, 5350, 3210, 39970, 973, 327, 253, 3236, 13583, 1071, 13301, 533, 39970, 7197, 327, 253, 15045, 1071, 13301, 685, 19554, 21421, 310, 4722, 337, 352, 310, 4722, 281, 1089, 285, 17813, 253, 5203, 6332, 275, 1071, 5239, 273, 4633, 49602, 253, 13279, 1505, 15045, 1071, 13301, 651, 320, 3486, 1020, 374, 35827, 512, 253, 1071, 13301, 407, 45294, 3490, 310, 8214, 3021, 16994, 638, 27423, 5933, 1037, 347, 7091, 275, 436, 2929, 310, 1774, 16280, 352, 310, 32811, 281, 2968, 342, 1524, 10186, 1966, 5251, 5203, 6046, 2581, 685, 13506, 5203, 6046, 495, 253, 2929, 3451, 84, 1071, 13301, 323, 1142, 4633, 22791, 15302, 326, 403, 11306, 908, 275, 253, 6239, 50275, 18, 760, 247, 8578, 273, 253, 5203, 6332, 574, 644, 15045, 407, 7497, 352, 310, 12744, 849, 1142, 5203, 6332, 403, 9829, 247, 3426, 873, 273, 15045, 13301, 943, 2486, 512, 5203, 6332, 374, 253, 2488, 760, 3588, 684, 253, 5203, 6332, 30346, 407, 13224, 4715, 2403, 253, 4560, 327, 1566, 5350, 417, 21414, 2217, 436, 310, 984, 253, 5203, 6332, 30346, 407, 13224, 4715, 778, 417, 320, 3907, 352, 310, 12744, 1880, 359, 476, 755, 253, 1072, 4560, 604, 359, 452, 247, 8578, 273, 15045, 13301, 10939, 19958, 432, 253, 2644, 873, 342, 3430, 31825, 3021, 253, 4390, 908, 15045, 13301, 275, 3036, 495, 67, 778, 417, 1957, 253, 2867, 273, 253, 643, 36568, 264, 5203, 6332, 271, 3368, 281, 921, 253, 3064, 875, 253, 1524, 5203, 6332, 285, 253, 1655, 8578, 778, 320, 3309, 50276, 7152, 33032, 2520, 2929, 323, 253, 806, 673, 2340, 684, 253, 42551, 6242, 285, 1774, 9099, 273, 1071, 2203, 6332, 275, 884, 4633, 13361, 15302, 1690, 4440, 257, 292, 278, 79, 382, 260, 338, 274, 740, 3966, 253, 2929, 4648, 13224, 4715, 5609, 281, 12661, 1071, 2203, 2228, 9183, 285, 840, 2546, 8651, 10709, 23521, 281, 6583, 390, 12654, 253, 941, 6332, 253, 7680, 273, 1071, 2203, 2228, 10618, 310, 1077, 1774, 281, 253, 13361, 4694, 13307, 81, 4910, 347, 841, 15302, 403, 594, 4633, 314, 908, 436, 2929, 671, 11330, 247, 3012, 1774, 8310, 326, 690, 19554, 6928, 501, 3024, 1093, 4536, 789, 1805, 685, 9542, 4394, 501, 3024, 1235, 689, 941, 342, 2228, 17660, 534, 12957, 271, 1774, 958, 326, 253, 1943, 37473, 3210, 778, 4944, 1512, 973, 281, 253, 3733, 8085, 1014, 281, 253, 941, 6332, 627, 4583, 436, 310, 247, 1077, 2266, 19529, 285, 1077, 3486, 1020, 281, 253, 1673, 326, 891, 651, 1014, 1908, 347, 247, 2266, 7431, 11771, 323, 253, 1682, 2929, 5722, 273, 436, 3540, 337, 436, 2929, 323, 253, 806, 673, 5421, 253, 42551, 2226, 2979, 285, 1774, 9099, 273, 1071, 2203, 6332, 275, 884, 1077, 4633, 15302, 1690, 4440, 257, 292, 260, 338, 274, 740, 278, 79, 382, 3966, 374, 253, 2929, 4720, 4197, 941, 3775, 17660, 275, 841, 3578, 1774, 15302, 534, 812, 5649, 253, 13361, 17312, 13307, 81, 7888, 1077, 1199, 495, 436, 2929, 4648, 253, 1486, 636, 682, 17660, 281, 10313, 690, 1774, 9099, 1690, 247, 1943, 37473, 6928, 778, 4944, 1512, 1199, 281, 253, 3733, 941, 22836, 270, 690, 19554, 2990, 24088, 501, 3024, 1093, 778, 789, 1805, 685, 501, 3024, 1235, 689, 253, 15045, 1486, 636, 682, 512, 
273, 841, 11815, 403, 2221, 1774, 281, 253, 1673, 577, 436, 2929, 310, 1077, 973, 15720, 1842, 1767, 410, 324, 4704, 1542, 4387, 318, 608, 253, 4477, 671, 2530, 253, 2127, 4793, 323, 39306, 253, 1543, 247, 4422, 323, 941, 33310, 7000, 9990, 2203, 22581, 1232, 3966, 891, 13414, 1158, 436, 2929, 556, 2201, 32213, 2708, 403, 690, 1355, 13991, 337, 323, 3158, 6553, 285, 7001, 251, 15337, 84, 253, 4477, 476, 2319, 849, 513, 368, 2098, 281, 8416, 731, 390, 667, 643, 4088, 281, 4311, 598, 253, 21999, 374, 253, 1491, 275, 253, 1655, 4706, 608, 310, 1512, 14086, 281, 1056, 352, 6927, 281, 1239, 891, 1804, 253, 4477, 2740, 253, 4753, 715, 690, 749, 21454, 390, 33295, 495, 310, 352, 9355, 281, 671, 4993, 253, 22581, 6332, 275, 253, 3733, 8085, 310, 253, 1072, 502, 7792, 44755, 281, 4993, 6332, 275, 253, 3733, 873, 5474, 33032, 2520, 310, 271, 1774, 285, 4722, 789, 8664, 5203, 6332, 275, 7744, 908, 49602, 50276, 1987, 8303, 50275, 12792, 273, 5203, 6332, 275, 1071, 5239, 273, 884, 7744, 3197, 15302, 50276, 16690, 273, 22269, 5216, 323, 253, 941, 5239, 50276, 12792, 273, 12739, 273, 5203, 6332, 275, 22791, 272, 11454, 3210, 253, 1895, 273, 5203, 6332, 275, 1071, 5239, 556, 417, 644, 1677, 2217, 4116, 594, 2080, 285, 253, 4477, 403, 12868, 271, 1774, 8037, 1060, 597, 897, 13224, 4715, 281, 1089, 6332, 275, 1071, 5239, 275, 2929, 352, 310, 2011, 326, 824, 6332, 476, 452, 1774, 12739, 672, 3210, 403, 2429, 253, 2022, 14855, 273, 253, 2929, 310, 326, 253, 3644, 2647, 273, 1892, 85, 351, 32043, 5203, 6332, 310, 417, 2783, 285, 5469, 352, 1537, 320, 253, 1083, 326, 824, 6332, 452, 1027, 3486, 323, 19554, 285, 9542, 3210, 891, 1158, 253, 4477, 943, 1408, 616, 5199, 323, 253, 11595, 8137, 273, 5203, 6332, 327, 3632, 3530, 347, 271, 3081, 3213, 281, 7277, 253, 9990, 342, 253, 6332, 1119, 342, 253, 1361, 273, 13224, 4715, 50276, 37585, 2523, 253, 5203, 12060, 618, 48149, 275, 271, 1650, 1677, 275, 4677, 337, 310, 3240, 10341, 2654, 1333, 891, 1158, 352, 15785, 690, 5955, 3237, 751, 436, 50276, 23955, 2523, 310, 326, 253, 11041, 273, 2228, 2281, 2439, 253, 884, 1071, 5239, 310, 3240, 1781, 884, 323, 3158, 3812, 4632, 374, 323, 495, 941, 5239, 436, 943, 320, 5469, 2490, 187, 4118, 18435, 27, 2520, 10895, 7350, 5203, 6332, 275, 1071, 5239, 323, 13361, 49602, 253, 30628, 1119, 352, 6210, 392, 1829, 264, 17194, 285, 4583, 247, 18511, 7680 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 22649, 5203, 6332, 275, 1071, 5239, 273, 4633, 8113, 2505, 285, 9797, 15302, 970, 13224, 4715, 253, 5189, 5203, 6332, 403, 17618, 407, 24597, 47549, 5820, 432, 7001, 251, 26301, 321, 76, 352, 310, 1774, 281, 5416, 512, 253, 1071, 13301, 275, 22791, 15302, 403, 3451, 253, 5161, 4560, 275, 436, 2929, 7881, 26401, 253, 2361, 22791, 1543, 275, 253, 6239, 594, 2080, 253, 4560, 326, 2169, 5350, 3210, 39970, 973, 327, 253, 3236, 13583, 1071, 13301, 533, 39970, 7197, 327, 253, 15045, 1071, 13301, 685, 19554, 21421, 310, 4722, 337, 352, 310, 4722, 281, 1089, 285, 17813, 253, 5203, 6332, 275, 1071, 5239, 273, 4633, 49602, 253, 13279, 1505, 15045, 1071, 13301, 651, 320, 3486, 1020, 374, 35827, 512, 253, 1071, 13301, 407, 45294, 3490, 310, 8214, 3021, 16994, 638, 27423, 5933, 1037, 347, 7091, 275, 436, 2929, 310, 1774, 16280, 352, 310, 32811, 281, 2968, 342, 1524, 10186, 1966, 5251, 5203, 6046, 2581, 685, 13506, 5203, 6046, 495, 253, 2929, 3451, 84, 1071, 13301, 323, 1142, 4633, 22791, 15302, 326, 403, 11306, 908, 275, 253, 6239, 50275, 18, 760, 247, 8578, 273, 253, 5203, 6332, 574, 644, 15045, 407, 7497, 352, 310, 12744, 849, 1142, 5203, 6332, 403, 9829, 247, 3426, 873, 273, 15045, 13301, 943, 2486, 512, 5203, 6332, 374, 253, 2488, 760, 3588, 684, 253, 5203, 6332, 30346, 407, 13224, 4715, 2403, 253, 4560, 327, 1566, 5350, 417, 21414, 2217, 436, 310, 984, 253, 5203, 6332, 30346, 407, 13224, 4715, 778, 417, 320, 3907, 352, 310, 12744, 1880, 359, 476, 755, 253, 1072, 4560, 604, 359, 452, 247, 8578, 273, 15045, 13301, 10939, 19958, 432, 253, 2644, 873, 342, 3430, 31825, 3021, 253, 4390, 908, 15045, 13301, 275, 3036, 495, 67, 778, 417, 1957, 253, 2867, 273, 253, 643, 36568, 264, 5203, 6332, 271, 3368, 281, 921, 253, 3064, 875, 253, 1524, 5203, 6332, 285, 253, 1655, 8578, 778, 320, 3309, 50276, 7152, 33032, 2520, 2929, 323, 253, 806, 673, 2340, 684, 253, 42551, 6242, 285, 1774, 9099, 273, 1071, 2203, 6332, 275, 884, 4633, 13361, 15302, 1690, 4440, 257, 292, 278, 79, 382, 260, 338, 274, 740, 3966, 253, 2929, 4648, 13224, 4715, 5609, 281, 12661, 1071, 2203, 2228, 9183, 285, 840, 2546, 8651, 10709, 23521, 281, 6583, 390, 12654, 253, 941, 6332, 253, 7680, 273, 1071, 2203, 2228, 10618, 310, 1077, 1774, 281, 253, 13361, 4694, 13307, 81, 4910, 347, 841, 15302, 403, 594, 4633, 314, 908, 436, 2929, 671, 11330, 247, 3012, 1774, 8310, 326, 690, 19554, 6928, 501, 3024, 1093, 4536, 789, 1805, 685, 9542, 4394, 501, 3024, 1235, 689, 941, 342, 2228, 17660, 534, 12957, 271, 1774, 958, 326, 253, 1943, 37473, 3210, 778, 4944, 1512, 973, 281, 253, 3733, 8085, 1014, 281, 253, 941, 6332, 627, 4583, 436, 310, 247, 1077, 2266, 19529, 285, 1077, 3486, 1020, 281, 253, 1673, 326, 891, 651, 1014, 1908, 347, 247, 2266, 7431, 11771, 323, 253, 1682, 2929, 5722, 273, 436, 3540, 337, 436, 2929, 323, 253, 806, 673, 5421, 253, 42551, 2226, 2979, 285, 1774, 9099, 273, 1071, 2203, 6332, 275, 884, 1077, 4633, 15302, 1690, 4440, 257, 292, 260, 338, 274, 740, 278, 79, 382, 3966, 374, 253, 2929, 4720, 4197, 941, 3775, 17660, 275, 841, 3578, 1774, 15302, 534, 812, 5649, 253, 13361, 17312, 13307, 81, 7888, 1077, 1199, 495, 436, 2929, 4648, 253, 1486, 636, 682, 17660, 281, 10313, 690, 1774, 9099, 1690, 247, 1943, 37473, 6928, 778, 4944, 1512, 1199, 281, 253, 3733, 941, 22836, 270, 690, 19554, 2990, 24088, 501, 3024, 1093, 778, 789, 1805, 685, 501, 3024, 1235, 689, 253, 15045, 1486, 636, 682, 512, 
273, 841, 11815, 403, 2221, 1774, 281, 253, 1673, 577, 436, 2929, 310, 1077, 973, 15720, 1842, 1767, 410, 324, 4704, 1542, 4387, 318, 608, 253, 4477, 671, 2530, 253, 2127, 4793, 323, 39306, 253, 1543, 247, 4422, 323, 941, 33310, 7000, 9990, 2203, 22581, 1232, 3966, 891, 13414, 1158, 436, 2929, 556, 2201, 32213, 2708, 403, 690, 1355, 13991, 337, 323, 3158, 6553, 285, 7001, 251, 15337, 84, 253, 4477, 476, 2319, 849, 513, 368, 2098, 281, 8416, 731, 390, 667, 643, 4088, 281, 4311, 598, 253, 21999, 374, 253, 1491, 275, 253, 1655, 4706, 608, 310, 1512, 14086, 281, 1056, 352, 6927, 281, 1239, 891, 1804, 253, 4477, 2740, 253, 4753, 715, 690, 749, 21454, 390, 33295, 495, 310, 352, 9355, 281, 671, 4993, 253, 22581, 6332, 275, 253, 3733, 8085, 310, 253, 1072, 502, 7792, 44755, 281, 4993, 6332, 275, 253, 3733, 873, 5474, 33032, 2520, 310, 271, 1774, 285, 4722, 789, 8664, 5203, 6332, 275, 7744, 908, 49602, 50276, 1987, 8303, 50275, 12792, 273, 5203, 6332, 275, 1071, 5239, 273, 884, 7744, 3197, 15302, 50276, 16690, 273, 22269, 5216, 323, 253, 941, 5239, 50276, 12792, 273, 12739, 273, 5203, 6332, 275, 22791, 272, 11454, 3210, 253, 1895, 273, 5203, 6332, 275, 1071, 5239, 556, 417, 644, 1677, 2217, 4116, 594, 2080, 285, 253, 4477, 403, 12868, 271, 1774, 8037, 1060, 597, 897, 13224, 4715, 281, 1089, 6332, 275, 1071, 5239, 275, 2929, 352, 310, 2011, 326, 824, 6332, 476, 452, 1774, 12739, 672, 3210, 403, 2429, 253, 2022, 14855, 273, 253, 2929, 310, 326, 253, 3644, 2647, 273, 1892, 85, 351, 32043, 5203, 6332, 310, 417, 2783, 285, 5469, 352, 1537, 320, 253, 1083, 326, 824, 6332, 452, 1027, 3486, 323, 19554, 285, 9542, 3210, 891, 1158, 253, 4477, 943, 1408, 616, 5199, 323, 253, 11595, 8137, 273, 5203, 6332, 327, 3632, 3530, 347, 271, 3081, 3213, 281, 7277, 253, 9990, 342, 253, 6332, 1119, 342, 253, 1361, 273, 13224, 4715, 50276, 37585, 2523, 253, 5203, 12060, 618, 48149, 275, 271, 1650, 1677, 275, 4677, 337, 310, 3240, 10341, 2654, 1333, 891, 1158, 352, 15785, 690, 5955, 3237, 751, 436, 50276, 23955, 2523, 310, 326, 253, 11041, 273, 2228, 2281, 2439, 253, 884, 1071, 5239, 310, 3240, 1781, 884, 323, 3158, 3812, 4632, 374, 323, 495, 941, 5239, 436, 943, 320, 5469, 2490, 187, 4118, 18435, 27, 2520, 10895, 7350, 5203, 6332, 275, 1071, 5239, 323, 13361, 49602, 253, 30628, 1119, 352, 6210, 392, 1829, 264, 17194, 285, 4583, 247, 18511, 7680 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper presents a method which takes as input sequences of video frames of scenes from which it is able to understand the dynamics of objects present in the scene and their interactions and transfer this to several downstream tasks the model consists of two modules one to distill individual object dynamics and a second relation module to understand interactions between objects the model is shown to have sota results on several downstream tasks including video understanding and reasoning video prediction reconstruction and segmentation the representations learnt by the model are shown improve upon prior work especially on tasks which require knowing object dynamics such as predicting the next frame in a sequence where collisions or occlusions can occur strengths novelty the two key contributions are clearly described use explicit dynamic features learnt by matching objects latents between the current and previous frames as opposed to only learning static scene properties as done in prior work explicitly model interactions between objects experimental verification small ablation on the effect of dynamic features in table 2 even if dynamics features are absent the model achieves better results than iodine which means that oddn can both model object dynamics and generates overall higher quality object static representations gives sota improvement across a variety of tasks table 1 and table 2 show a consistently significant improvement over prior work oddn model representations are fed into aloe instead of using static monet features which aloe originally used this oddnaloe model improves on state of the art on clever for various tasks interestingly the gains are greatest on predictive and counterfactual questions which are related to predicting how objects will move oddn also improves over using object representations from iodine or provide in table 2 qualitatively in fig 4 the generated images for the next frames are better for oddn as it is able to preserve the color of the objects as well as position whilst provide fails to preserve color after a few frames oddn also does better at reconstructing images and generates more compact segmentation masks than iodine or provide in fig 2 weaknesses clarity of explanations and figures could be greatly improved although the ideas are clear the method and implementation could be made clearer eg write out the algorithm in pseudo code in addition the notation is slightly confusing given there are two meanings for t inference model time step and time step in the video fig 7 c section titled disentanglement is not clearly explained the caption is incorrect b is actually referring to c a is not explained what is a dynamic dimension you could add some arrows to the diagram to explain what is happening across the rowscolumns its not clear how to interpret the images in addition where does velocity magnitude come from this was not explained previously figure 4 is also not very clear explain in the caption what is happening in the 6 frames ie blue and brown cylinder collide with odnn this interaction is predicted but with provide it loses information about the blue cyclinder it would be useful to explicitly mention the colors to look for in the images in the caption or in section 4 in the text the images are also really small hard to see what is happening figure 3 is it showing mse from 6 frames in total or averaged over some number of runs 
table 3 for each metric add an arrow to indicate if lower or higher is better figure 6 is also not clear the caption indicating the colors for the frames and timestamps is really tiny lack of implementation details no details about how the model was implemented or how to reproduce the results equation 2 and 3 explain what iodine does but the oddn method is not fully explained typos section 43 weather whether the paper presents a method with two key insightscontributions that are not present in prior work the ideas are validated quantitatively and qualitatively the technical details figures and tables could be improved with better explanations and descriptions docsepthis paper presents an unsupervised distillation of disentangled object dynamics from raw video inputs the distilled dynamics model is capable of causal reasoning and future frame prediction extensive results on tasks of segmentation and reconstruction show its favorable performance strengths the authors provide a novel distillation method to understand object dynamics the proposed system shows stateoftheart performance on clevrer weaknesses i would like to see more qualitative results on the attention module visualized coefficients in fig 6 only cover one single scenario i think the paper needs more validations on whether the attention module converges well the proposed method is validated only in clevrer which does not show its generalization capability in addition it is hard to find the differences in the qualitative results fig 4 compared to the previous methods in fig 5 iodine cannot represent the shadow areas while the proposed oddn can reconstruct them could you elaborate on the reason how oddn has a representation ability on the shadow areas i would like to see the comparison of the computational complexity with the previous methods eg aloe iodine etc please discuss the limitations of the proposed method it is required to discuss whether there are any other aspects that can be further improved as future works for this task overall the proposed architecture is designed simple and shows stateoftheart performance there are minor issues as commented in main review but i am leaning towards positive at this moment since the topic is out of my scope i would like to see other reviewers opinions docsepthe paper presents a framework that distillates explicit object dynamics eg velocity from the discrepancies between objects static latents of different frames called the object dynamics distillation network oddn the approach is built upon recent works which decompose static scenes into independent latents by randomly assigning different objects to a fixed number of slots which share weights to obtain latents with a common format and this paper makes use of the attention mechanism of the transformer to align objects latents of two input images and encode the aggregated representation into a lowdimensional vector to obtain disentangled object dynamics afterward the paper builds a relation module to model object interactions based on npe and rnem strengths the paper presents a very interesting idea of using transformer structure to align latent features and distillate object dynamics and it seems very effective experimental results show superior performance compared to the sota on video understanding and reasoning weaknesses the paper is built upon many existing techniques including the iodine npe and rnem especially the core part of it is the use of iodine latents and aligning them by a transformer structure the reasoning module is a 
combination of npe and rnem though the paper is built upon many existing techniques including the iodine npe and rnem and the novel part of this paper is using a transformer structure to align the iodine latents and find discrepancies i still consider this paper very interesting and strong it may provide some insight on how to model individual object dynamics from videos moreover the proposed method is very effective as experimental results show superior performance docsepthis paper proposes a method of extracting the latent space that describes objects in a video the encoder part inference model encodes an input image framebyframe from a video based on iodine the relative information between the latent vectors of three consequent frames is extracted and updated based on transformer the input image is reconstructed from the latent vectors by the decoder part generative model the proposed method is tested for various tasks on clevrer dataset the proposed method is compared to the previous methods iodine and provide and the better results are shown but the problem is that the detail of the method is not described and difficult to understand in th experiment of video understanding the input for iodine is a single frame but that for the proposed method is multiple frame from video how different the input data is if the information extracted by the proposed method is the dynamics of a scene the importance will depends on the scene but no example is shown in the paper to confirm the effectiveness of dynamic distillation although the method provide is chosen as a method that introduces temporal information it may not be meaningful because the performance is worse than the base method iodine which uses a single image in the experiment of prediction it is difficult to recognize how is the motion of objects and interaction each other in fig4 is the example appropriate to show if the proposed method can extract the dynamics from video in decomposition what is the condition of the experiment is the reconstruction through the autoencoder or reconstruction from the latent vectors given by user if in the latter case what and how are the input latent vectors given in the explanation of the proposed method the indexing of the latent vector is ambiguous a latent vector is indexed by the frame number i latent index k time t and the pixel coordinate but what is the difference between the frame number and the time and the indexes are omitted in many parts of the explanation it is difficult to understand which latent variables are considered to find the relationship the detail of generative model is not explained the color is estimated for each pixel but the relationship between a latent vector and a pixel is not explained although the proposed method seems to show the better results in the experiments the explanation is not sufficient to convince if the experiment is appropriate and fair additionally the explanation of the method is not clear to understand the detail additionally it is not clear that what kind of information is extracted by the part dynamic distillation introduced in this paper it seems no experiment is shown that object dynamics is distilled by the proposed method consequently the reviewer thinks that this paper is below the acceptance docsepthis paper aims at learning object dynamics from unlabelled videos for multiobject representations this work builds upon the inference model from greff et al 2019 to obtain the multiobject representations using that this paper learns latent dynamics by 
predicting the future frame given two previous observations the proposed network is trained by maximizing the loglikelihood in the pixel space for reconstructions as well as for predictions similar to standard work in video prediction in addition to that this paper optimizes the objective from greff et al 2019 to obtain multiobject representations the main contribution of this work lies in the module which predicts the future multiobject representation given the two past ones by using i dynamic distillation as well as ii a relation module the dynamic distillation module is motivated by the claim that the objecttoslot assignment switches inside videos therefore the paper introduces a selfattention layer to match the slots of the specific objects across two frames the output of this module after applying the feedforward network is called dynamic representation the relation module is proposed to model the interactions between object pairs this is implemented 1013 using selfattention between the dynamics from above and static representation of objects i and k slots the output is added to update the dynamic representation from the dynamic distillation module ultimately the future state is predicted using a linear transformation of the static representation of time t and the updated dynamic representation the paper evaluates their learned representation on the clevrer dataset for video reasoning prediction reconstruction and segmentation strength the introduced modules dynamic distillation and relation modules are well motivated and their design and realization seem reasonable the paper reports strong numbers on the task of video reasoning outperforming existing works on clevrer weaknesses missing ablation experiment for dynamic distillation and relation module the paper claims that the slot assignment switches unpredictably inside the video it proposed the dynamic distillation attention mechanism to solve that yet it is not evaluated if this module actually solves this problem with that the reader does not know if this module is effective the same for the relation module the paper should report numbers with and without the use of this module to assess its impact missing baseline to assess the significance of the proposed modules dynamic distillation relation module the reader also needs results without them this baseline can be created by replacing the proposed modules with a feed forward neural network with roughly the same number of parameters as the proposed modules that predicts the next static representation t1 based on the concatenation of the static representation at time t and t1 the paper introduces some kind of baseline in table 2 oddn wo dyn by reducing the 4 dimension dynamics thus the network is left to predict the next state solely based on the current static representation at time t which is insufficient that is why i urge the authors to implement the baseline as pointed out above and to report the resulting numbers in the rebuttal the paper compares their method to provide which is the only other competitor that leverages temporal information provide extends upon the static method of iodine using a 2dlstm the reported numbers in table 2 for provide are significantly 20 worse than for iodine which does not make sense to me especially when looking at the provide paper in which provide is on par with iodine on average the same holds true for table 3 the mse reported in provide is significantly better than iodine whereas in this paper mse is significantly worse 48 for mse this 
suggests to me that there is something wrong with how provide was trainedevaluated the paper discusses a reason why oddn is superior compared to provide in 43 but does not address why there is such a huge gap between iodine and provide could the authors please clarify this for me this shows and enhances the need for a baseline as discussed above missing implementation details the paper does not provide any implementation details like eg optimizer learning rate beta value hyperparameters etc this makes the paper not reproducible and i urge the authors to add these details does the code get released after acceptance the paper evaluates their proposed model only on the clevrer dataset to better assess the impact of this work the method needs to be evaluated at least on one other dataset such as eg cater cater rohit girdhar et al cater a diagnostic dataset for compositional actions and temporal reasoning iclr 2020 missing related work this paper misses a relevant related work from veerapaneni et al entity abstraction in visual modelbased reinforcement learning corl 2019 this work extracts entity representations from images and learns their dynamics for modelbased reinforcement learning moreover this paper only cites related work which operates on multiobject representations yet there exists related work also for nonmultiobject dynamics representation learning which should be cited such as minderer et al unsupervised learning of object structure and dynamics from videos neurips 2019 blattmann et al understanding object dynamics for interactive imagetovideo synthesis cvpr 2021 also the work should cite major works in video prediction since it is the task the network gets trained on lee et al stochastic adversarial video prediction savp franceschi et al stochastic latent residual video prediction slrvp denton et al stochastic video generation with a learned prior svg missing comparison to current nonmulti object representationbased approaches for video prediction it would have been nice if the paper would have compared to nonmulti object video prediction approaches to put their work more into context this work does not deal with the inherent ambiguity of the video prediction task itself it deterministically predicts a next state given two past observations thereby it cannot cover the different scenarios the future can hold the work from lee et al actually showed that by accounting for this ambiguity the learned model improves on the task of video prediction since it does not average over multiple predictions and can cover multiple different future scenarios i would like to hear the authors take on that and see this more as future work for multiobject dynamics learning general notes below 4 there is zdyn twice i believe the second one should be zsta paper states that their approach is capable of future frames prediction in complex 3d scenes i have to disagree with this since this dataset on which the method is evaluated is far away from any realworld video dataset that is why i would not call it complex change the name from reported to aloe wo selfsuperv to make it easier to read in table 2 typos figure 7 caption its its very young infants infants line intensity line intensity figure 7 has ab and c but in the caption only a and b occur in c the xaxis goes from 2 to 2 but there are 10 images 3 to 3 the paper mentioned that the objects motions has to be continuous through space while in the next sentence explaining how the next discrete static object latent state is predicted one would have to use 
eg neural odes to actually model continuous object motion so continuous should be replaced by eg smooth the writing of the paper needs some polishing since the wording in many sentences does not make sense eg objects motion consists with the basic physics concept in the caption of 7b it says line intensity indicates the magnitude of attention probability while in the figure it is written that the score of blue is 099 to me all the intensities look more or less the same while one of them is already 099 out of 1 the other ones should be then 001 how does that make sense could the authors not show all the weights between all objects between two frames also for multiple examples this would provide much more insight to the reader it is difficult to spot the object movement in figure 7 c it would be great if the authors could add some markers to help visibility the modules proposed in this work are well motivated and designed and produce strong results on video reasoning yet i am concerned about the insufficient evaluation without the ablations and the baseline i discussed in weaknesses it is difficult to quantify the actual impact of the proposed method and its underlying modules moreover experimental details are missing which makes it difficult to reproduce its reported numbers that is why my initial rating is below the acceptance threshold but i am open to adjusting my rating if the authors do sufficiently address the points raised in weaknesses ### Summary:
this work presents a novel method to learn object dynamics from unlabelled videos and shows its benefits on causal reasoning and future frame prediction this paper received 4 positive reviews and 1 negative review in the rebuttal the authors have addressed most of the concerns ac feels this work is very interesting and deserves to be published at iclr 2022 the reviewers did raise some valuable concerns that should be addressed in the final cameraready version of the paper the authors are encouraged to make other necessary changes
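Several of the reviews in this record describe the same core mechanism: per-object slot latents are inferred for each frame, slots are soft-matched across consecutive frames with attention (because the slot-to-object assignment can switch between frames), and a dynamics signal is distilled from the discrepancy between matched latents before the relation module and next-frame prediction. The sketch below is a hedged, minimal illustration of that slot-alignment step; the function name, shapes, and the subtraction-based dynamics are assumptions rather than the reviewed model's actual architecture.

```python
# Minimal sketch of aligning object slots across two frames and reading off a
# crude per-slot dynamics signal. Illustrative only; not the ODDN implementation.
import torch
import torch.nn.functional as F

def distill_slot_dynamics(z_prev: torch.Tensor, z_curr: torch.Tensor) -> torch.Tensor:
    """z_prev, z_curr: (num_slots, latent_dim) static slot latents at frames t-1 and t."""
    dim = z_curr.shape[-1]
    # Attention scores between current slots (queries) and previous slots (keys);
    # softmax over previous slots gives a soft slot-to-slot assignment.
    scores = (z_curr @ z_prev.t()) / dim ** 0.5      # (K, K)
    assignment = F.softmax(scores, dim=-1)
    z_prev_matched = assignment @ z_prev             # previous latents reordered to current slots
    # Discrepancy between each slot now and its matched slot before: a simple
    # stand-in for the distilled dynamics code discussed in the reviews.
    return z_curr - z_prev_matched                   # (K, latent_dim)

# In the full model this signal would be compressed to a low-dimensional dynamics
# vector, combined with pairwise slot interactions (the relation module), and used
# together with the static latents to predict the next frame's slot states.
```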
[ 310, 2709, 3665, 432, 3492, 849, 1027, 253, 3280, 941, 310, 604, 253, 1491, 10375, 407, 253, 4081, 1332, 310, 253, 8062, 273, 247, 6200, 253, 6349, 588, 7024, 327, 253, 6200, 533, 642, 1650, 310, 2011, 275, 253, 2929, 281, 6583, 253, 12510, 273, 7870, 940, 21755, 3738, 253, 1332, 2085, 310, 6777, 347, 247, 1332, 326, 23970, 11935, 1491, 352, 778, 417, 320, 14282, 984, 253, 3045, 310, 7197, 685, 253, 2613, 1332, 39887, 534, 4648, 247, 2014, 2460, 50276, 249, 253, 3368, 273, 10554, 352, 310, 2834, 281, 9446, 849, 310, 253, 3200, 273, 5113, 285, 5016, 1016, 643, 275, 3036, 21, 310, 253, 1650, 4569, 281, 921, 604, 253, 4081, 1332, 476, 4908, 253, 8062, 432, 3492, 50276, 249, 14717, 752, 310, 253, 1617, 273, 253, 3368, 310, 253, 14433, 949, 253, 6753, 36465, 390, 14433, 432, 253, 21624, 11390, 1677, 407, 2608, 604, 275, 253, 6158, 1083, 752, 285, 849, 403, 253, 3280, 21624, 11390, 1677, 50276, 249, 253, 8813, 273, 253, 4081, 1332, 253, 44176, 273, 253, 21624, 4972, 310, 23851, 247, 21624, 4972, 310, 32891, 407, 253, 3665, 1180, 891, 21624, 3605, 465, 673, 246, 285, 253, 12275, 13249, 533, 752, 310, 253, 3064, 875, 253, 3665, 1180, 285, 253, 673, 285, 253, 28308, 403, 11035, 275, 1142, 4243, 273, 253, 8813, 352, 310, 2834, 281, 2096, 534, 21624, 4903, 403, 2783, 281, 1089, 253, 2954, 50276, 783, 2508, 273, 1006, 800, 1566, 310, 417, 5544, 253, 3295, 310, 5998, 323, 1016, 12275, 533, 253, 2954, 875, 247, 21624, 4972, 285, 247, 12275, 310, 417, 5544, 50276, 20261, 253, 4081, 1332, 3133, 281, 921, 253, 1805, 1543, 275, 253, 4679, 253, 8813, 310, 417, 4209, 281, 18578, 604, 253, 3368, 310, 4569, 285, 4344, 23000, 253, 8813, 273, 253, 1332, 310, 417, 2590, 281, 2096, 253, 2508, 23000, 352, 310, 417, 2590, 326, 752, 2238, 273, 1491, 310, 10375, 407, 253, 629, 7870, 940, 21755, 5611, 275, 436, 2929, 352, 3133, 642, 3368, 310, 2011, 326, 1789, 8062, 310, 35755, 407, 253, 4081, 1332, 17912, 253, 37317, 11121, 326, 436, 2929, 310, 2708, 253, 14924, 50276, 7152, 33032, 2520, 2929, 13698, 387, 4715, 1789, 8062, 432, 440, 47728, 10556, 323, 4471, 6082, 14237, 436, 789, 21168, 2220, 253, 17032, 1566, 432, 13738, 567, 1162, 355, 6247, 281, 4044, 253, 4471, 6082, 14237, 970, 326, 436, 2929, 33772, 21624, 8062, 407, 21565, 253, 2852, 3665, 1677, 767, 2045, 7313, 253, 4081, 2990, 310, 10166, 407, 46875, 253, 2412, 7513, 10202, 275, 253, 12275, 2317, 323, 49866, 6477, 347, 973, 347, 323, 13650, 2074, 281, 2629, 789, 275, 3492, 10554, 275, 1635, 281, 326, 436, 2929, 5556, 4219, 253, 8103, 432, 13738, 567, 1162, 355, 6247, 281, 4044, 4471, 6082, 14237, 253, 2022, 7680, 273, 436, 789, 8696, 275, 253, 6333, 534, 26295, 253, 2852, 4471, 6082, 6779, 1677, 253, 767, 2469, 4394, 407, 970, 891, 7870, 940, 21755, 347, 973, 347, 21255, 247, 5886, 6333, 50276, 783, 7870, 940, 21755, 6333, 310, 17194, 407, 253, 1750, 326, 253, 1789, 85, 375, 11753, 12714, 20994, 3304, 10556, 3103, 253, 2929, 23970, 247, 1881, 42959, 3828, 281, 3761, 253, 25195, 273, 253, 2173, 5113, 2439, 767, 13009, 253, 3453, 273, 436, 6333, 846, 9433, 253, 3997, 10495, 2990, 310, 1925, 7870, 6779, 50276, 783, 5886, 6333, 310, 4081, 281, 1566, 253, 6355, 875, 1789, 8557, 436, 310, 9009, 8437, 20, 970, 1881, 42959, 875, 253, 8062, 432, 1840, 285, 4228, 6779, 273, 5113, 891, 285, 465, 25195, 253, 3453, 310, 2879, 281, 5731, 253, 7870, 6779, 432, 253, 7870, 940, 21755, 6333, 9142, 253, 2852, 1375, 310, 8131, 970, 247, 4872, 9261, 273, 253, 4228, 6779, 273, 673, 246, 285, 253, 9300, 7870, 6779, 50276, 783, 2929, 44995, 616, 6311, 6779, 327, 253, 1391, 87, 
6554, 10895, 323, 3492, 14720, 10554, 14433, 285, 26405, 4757, 50276, 783, 5611, 11911, 7870, 940, 21755, 285, 5886, 11911, 403, 973, 17194, 285, 616, 2216, 285, 22786, 1646, 5272, 50276, 783, 2929, 5012, 2266, 3904, 327, 253, 4836, 273, 3492, 14720, 41731, 14692, 5368, 2987, 327, 1391, 87, 6554, 50276, 20881, 1255, 265, 50276, 33722, 28913, 3368, 323, 7870, 940, 21755, 285, 5886, 6333, 253, 2929, 3916, 326, 253, 15239, 12714, 20994, 29444, 1598, 3304, 253, 3492, 352, 4081, 253, 7870, 940, 21755, 4116, 5122, 281, 8415, 326, 2568, 352, 310, 417, 6760, 604, 436, 6333, 2686, 35910, 436, 1895, 342, 326, 253, 9414, 1057, 417, 871, 604, 436, 6333, 310, 3576, 253, 1072, 323, 253, 5886, 6333, 253, 2929, 943, 1304, 3904, 342, 285, 1293, 253, 897, 273, 436, 6333, 281, 2939, 697, 3486, 50275, 33722, 8245, 281, 2939, 253, 8453, 273, 253, 4081, 11911, 7870, 940, 21755, 50276, 16429, 6333, 253, 9414, 671, 3198, 1543, 1293, 731, 436, 8245, 476, 320, 3562, 407, 15706, 253, 4081, 11911, 342, 247, 3997, 3579, 11454, 2990, 342, 11467, 253, 1072, 1180, 273, 3602, 347, 253, 4081, 11911, 326, 26295, 253, 1735, 4228, 6779, 246, 18, 1754, 327, 253, 32147, 318, 273, 253, 4228, 6779, 387, 673, 246, 285, 246, 18, 253, 2929, 23970, 690, 2238, 273, 8245, 275, 2829, 374, 8909, 79, 32063, 24187, 407, 8493, 253, 577, 7877, 8062, 3021, 253, 2990, 310, 1669, 281, 3283, 253, 1735, 1375, 12718, 1754, 327, 253, 1655, 4228, 6779, 387, 673, 246, 534, 310, 12497, 326, 310, 2139, 891, 21434, 253, 4477, 281, 3359, 253, 8245, 347, 8042, 562, 1840, 285, 281, 1304, 253, 4795, 3904, 275, 253, 30080, 22559, 50276, 783, 2929, 26662, 616, 1332, 281, 2085, 534, 310, 253, 760, 643, 32048, 326, 19732, 1131, 11935, 1491, 2085, 8725, 2220, 253, 4228, 1332, 273, 39887, 970, 247, 374, 11830, 296, 78, 253, 2361, 3904, 275, 2829, 374, 323, 2085, 403, 3012, 1384, 7197, 685, 323, 39887, 534, 1057, 417, 1056, 3282, 281, 479, 3340, 672, 2819, 387, 253, 2085, 2929, 275, 534, 2085, 310, 327, 1061, 342, 39887, 327, 3388, 253, 1072, 6556, 2032, 323, 2829, 495, 253, 278, 339, 2361, 275, 2085, 310, 3012, 1805, 685, 39887, 5727, 275, 436, 2929, 278, 339, 310, 3012, 7197, 5693, 323, 278, 339, 436, 5936, 281, 479, 326, 627, 310, 1633, 3430, 342, 849, 2085, 369, 10166, 15419, 11634, 253, 2929, 25339, 247, 1921, 2139, 8909, 79, 310, 8936, 2429, 281, 2085, 275, 7652, 533, 1057, 417, 2953, 2139, 627, 310, 824, 247, 5699, 8037, 875, 39887, 285, 2085, 812, 253, 4477, 4496, 19148, 436, 323, 479, 436, 2722, 285, 25222, 253, 878, 323, 247, 8245, 347, 5469, 1840, 50276, 33722, 7092, 4278, 253, 2929, 1057, 417, 2085, 667, 7092, 4278, 751, 24088, 5556, 6081, 4715, 2281, 9840, 1318, 4373, 22041, 3966, 436, 2789, 253, 2929, 417, 41374, 285, 891, 21434, 253, 4477, 281, 823, 841, 4278, 1057, 253, 2127, 755, 4439, 846, 14924, 50276, 783, 2929, 44995, 616, 4081, 1566, 760, 327, 253, 1391, 87, 6554, 10895, 281, 1805, 2939, 253, 3486, 273, 436, 789, 253, 1332, 3198, 281, 320, 6760, 387, 1878, 327, 581, 643, 10895, 824, 347, 24088, 28335, 28335, 687, 26824, 305, 1817, 9432, 1162, 355, 28335, 247, 10401, 10895, 323, 5889, 267, 5231, 285, 11935, 14720, 17857, 32888, 9169, 5816, 2905, 789, 436, 2929, 38771, 247, 4623, 2905, 789, 432, 1670, 254, 3682, 29571, 1162, 355, 10726, 38562, 275, 5304, 1566, 3169, 35221, 4715, 944, 77, 6247, 436, 789, 16756, 10726, 14237, 432, 3888, 285, 33772, 616, 8062, 323, 1566, 3169, 35221, 4715, 50276, 3062, 1189, 436, 2929, 760, 28070, 2905, 789, 534, 17209, 327, 4471, 6082, 14237, 2568, 627, 4961, 2905, 789, 671, 323, 1327, 23939, 6082, 8062, 6779, 
4715, 534, 943, 320, 11106, 824, 347, 50275, 14785, 70, 6554, 1162, 355, 440, 35421, 4715, 273, 1789, 2605, 285, 8062, 432, 10556, 5723, 2824, 6247, 50275, 1559, 1595, 8420, 1162, 355, 4685, 1789, 8062, 323, 18366, 4440, 292, 729, 2842, 9066, 30105, 1087, 43425, 50276, 12563, 253, 789, 943, 26542, 2201, 2987, 275, 3492, 10554, 1580, 352, 310, 253, 4836, 253, 2990, 4850, 10166, 327, 50276, 14906, 1162, 355, 19191, 48960, 3492, 10554, 5745, 81, 50276, 925, 1972, 4635, 1162, 355, 19191, 21624, 12541, 3492, 10554, 1499, 23667, 81, 50276, 69, 26947, 1162, 355, 19191, 3492, 5978, 342, 247, 6311, 2720, 18504, 72, 50275, 33722, 5301, 281, 1655, 1327, 23939, 1789, 6779, 3169, 7274, 323, 3492, 10554, 352, 651, 452, 644, 5322, 604, 253, 2929, 651, 452, 2429, 281, 1327, 23939, 1789, 3492, 10554, 7274, 281, 1691, 616, 789, 625, 715, 3634, 50276, 2520, 789, 1057, 417, 2968, 342, 253, 12794, 28931, 273, 253, 3492, 10554, 4836, 3139, 352, 11544, 18260, 26295, 247, 1735, 1375, 1677, 767, 2469, 7313, 7624, 352, 2550, 3835, 253, 1027, 15216, 253, 2852, 476, 2186, 253, 789, 432, 458, 70, 1162, 355, 2686, 2692, 326, 407, 15890, 323, 436, 28931, 253, 6311, 1566, 19132, 327, 253, 4836, 273, 3492, 10554, 1580, 352, 1057, 417, 3388, 689, 2709, 13650, 285, 476, 3835, 2709, 1027, 2852, 15216, 50276, 74, 651, 751, 281, 4089, 253, 4477, 1379, 327, 326, 285, 923, 436, 625, 347, 2852, 789, 323, 4471, 6082, 8062, 4715, 50276, 16691, 7211, 50276, 27490, 577, 627, 310, 1182, 42927, 7019, 891, 2868, 253, 1273, 581, 943, 320, 1182, 18233, 50276, 20790, 3054, 326, 616, 2746, 310, 7032, 273, 2852, 13009, 10554, 275, 2570, 495, 69, 13451, 891, 452, 281, 14936, 342, 436, 1580, 436, 10895, 327, 534, 253, 1332, 310, 6760, 310, 2080, 1977, 432, 667, 1524, 10186, 3492, 10895, 326, 310, 2139, 891, 651, 417, 1067, 352, 2570, 50275, 4168, 253, 1416, 432, 2361, 281, 355, 3703, 32063, 1881, 12185, 87, 281, 1056, 352, 6927, 281, 1239, 275, 2829, 374, 963, 993, 4677, 818, 11743, 697, 50276, 953, 1077, 2872, 15330, 50276, 2050, 1103, 1386, 7133, 50276, 1282, 7133, 50276, 13206, 818, 556, 490, 285, 260, 533, 275, 253, 11743, 760, 247, 285, 270, 2826, 275, 260, 253, 1269, 10565, 4566, 432, 374, 281, 374, 533, 627, 403, 884, 3888, 50276, 20, 281, 495, 50275, 783, 2929, 5393, 326, 253, 5113, 14462, 50276, 7110, 281, 320, 5415, 949, 2317, 1223, 275, 253, 1735, 6197, 15571, 849, 253, 1735, 13358, 4228, 1789, 21624, 1375, 310, 8131, 581, 651, 452, 281, 897, 24088, 11454, 258, 3229, 281, 2686, 1566, 5415, 1789, 3200, 594, 5415, 943, 320, 7932, 407, 24088, 6032, 50276, 783, 4028, 273, 253, 2929, 3198, 690, 35952, 1580, 253, 41066, 275, 1142, 14683, 1057, 417, 1056, 3282, 24088, 5113, 3200, 8414, 342, 253, 5044, 12057, 4473, 50274, 249, 253, 11743, 273, 818, 67, 352, 2296, 1386, 7133, 6492, 253, 9777, 273, 4116, 5912, 1223, 275, 253, 4677, 352, 310, 3542, 326, 253, 4868, 273, 4797, 310, 470, 1525, 281, 479, 512, 253, 26112, 1007, 625, 390, 1679, 253, 1072, 1223, 581, 273, 731, 310, 2168, 470, 1525, 562, 273, 337, 253, 643, 4394, 943, 320, 840, 209, 2874, 849, 1057, 326, 1056, 3282, 50276, 16534, 253, 4477, 417, 921, 512, 253, 13461, 875, 512, 5113, 875, 767, 13009, 671, 323, 2709, 6667, 436, 651, 2085, 1199, 625, 12288, 281, 253, 9414, 50275, 262, 310, 2834, 281, 6308, 253, 1789, 4866, 275, 4677, 818, 260, 352, 651, 320, 1270, 604, 253, 4477, 812, 823, 690, 9588, 281, 1361, 23114, 50275, 783, 11911, 4081, 275, 436, 789, 403, 973, 17194, 285, 4158, 285, 4711, 2266, 1543, 327, 3492, 14720, 2568, 891, 717, 7514, 670, 253, 12497, 7103, 1293, 253, 490, 
77, 569, 285, 253, 8245, 891, 5469, 275, 32213, 352, 310, 2834, 281, 22048, 253, 4588, 3486, 273, 253, 4081, 1332, 285, 697, 6944, 11911, 25761, 5661, 4278, 403, 5816, 534, 2789, 352, 2834, 281, 18302, 697, 2361, 3904, 326, 310, 2139, 619, 3302, 13716, 310, 2708, 253, 14924, 7887, 533, 891, 717, 1527, 281, 19427, 619, 13716, 604, 253, 4477, 513, 10481, 2953, 253, 2792, 5439, 275, 32213, 50275, 187, 187, 4118, 18435, 27, 2520, 789, 10262, 247, 4460, 1332, 288, 281, 3037, 1789, 8062, 432, 440, 47728, 10556, 285, 2722, 697, 5373, 327, 19349, 14720, 285, 2852, 3665, 10554, 50276, 2520, 2929, 2959, 577, 2762, 10123, 285, 337, 4016, 2278, 275, 253, 30080, 22559, 253, 4477, 452, 9713, 954, 273, 253, 7350, 913, 9193, 436, 789, 310, 1077, 4722, 285, 22828, 281, 320, 3863, 327, 17857, 32888, 1384, 1423, 253, 30628, 858, 7164, 690, 9865, 7350, 326, 943, 320, 9713, 275, 253, 2457, 4049, 254, 609, 5102, 2715, 273, 253, 2929, 253, 4477, 403, 14659, 281, 1056, 643, 3309, 2544 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 310, 2709, 3665, 432, 3492, 849, 1027, 253, 3280, 941, 310, 604, 253, 1491, 10375, 407, 253, 4081, 1332, 310, 253, 8062, 273, 247, 6200, 253, 6349, 588, 7024, 327, 253, 6200, 533, 642, 1650, 310, 2011, 275, 253, 2929, 281, 6583, 253, 12510, 273, 7870, 940, 21755, 3738, 253, 1332, 2085, 310, 6777, 347, 247, 1332, 326, 23970, 11935, 1491, 352, 778, 417, 320, 14282, 984, 253, 3045, 310, 7197, 685, 253, 2613, 1332, 39887, 534, 4648, 247, 2014, 2460, 50276, 249, 253, 3368, 273, 10554, 352, 310, 2834, 281, 9446, 849, 310, 253, 3200, 273, 5113, 285, 5016, 1016, 643, 275, 3036, 21, 310, 253, 1650, 4569, 281, 921, 604, 253, 4081, 1332, 476, 4908, 253, 8062, 432, 3492, 50276, 249, 14717, 752, 310, 253, 1617, 273, 253, 3368, 310, 253, 14433, 949, 253, 6753, 36465, 390, 14433, 432, 253, 21624, 11390, 1677, 407, 2608, 604, 275, 253, 6158, 1083, 752, 285, 849, 403, 253, 3280, 21624, 11390, 1677, 50276, 249, 253, 8813, 273, 253, 4081, 1332, 253, 44176, 273, 253, 21624, 4972, 310, 23851, 247, 21624, 4972, 310, 32891, 407, 253, 3665, 1180, 891, 21624, 3605, 465, 673, 246, 285, 253, 12275, 13249, 533, 752, 310, 253, 3064, 875, 253, 3665, 1180, 285, 253, 673, 285, 253, 28308, 403, 11035, 275, 1142, 4243, 273, 253, 8813, 352, 310, 2834, 281, 2096, 534, 21624, 4903, 403, 2783, 281, 1089, 253, 2954, 50276, 783, 2508, 273, 1006, 800, 1566, 310, 417, 5544, 253, 3295, 310, 5998, 323, 1016, 12275, 533, 253, 2954, 875, 247, 21624, 4972, 285, 247, 12275, 310, 417, 5544, 50276, 20261, 253, 4081, 1332, 3133, 281, 921, 253, 1805, 1543, 275, 253, 4679, 253, 8813, 310, 417, 4209, 281, 18578, 604, 253, 3368, 310, 4569, 285, 4344, 23000, 253, 8813, 273, 253, 1332, 310, 417, 2590, 281, 2096, 253, 2508, 23000, 352, 310, 417, 2590, 326, 752, 2238, 273, 1491, 310, 10375, 407, 253, 629, 7870, 940, 21755, 5611, 275, 436, 2929, 352, 3133, 642, 3368, 310, 2011, 326, 1789, 8062, 310, 35755, 407, 253, 4081, 1332, 17912, 253, 37317, 11121, 326, 436, 2929, 310, 2708, 253, 14924, 50276, 7152, 33032, 2520, 2929, 13698, 387, 4715, 1789, 8062, 432, 440, 47728, 10556, 323, 4471, 6082, 14237, 436, 789, 21168, 2220, 253, 17032, 1566, 432, 13738, 567, 1162, 355, 6247, 281, 4044, 253, 4471, 6082, 14237, 970, 326, 436, 2929, 33772, 21624, 8062, 407, 21565, 253, 2852, 3665, 1677, 767, 2045, 7313, 253, 4081, 2990, 310, 10166, 407, 46875, 253, 2412, 7513, 10202, 275, 253, 12275, 2317, 323, 49866, 6477, 347, 973, 347, 323, 13650, 2074, 281, 2629, 789, 275, 3492, 10554, 275, 1635, 281, 326, 436, 2929, 5556, 4219, 253, 8103, 432, 13738, 567, 1162, 355, 6247, 281, 4044, 4471, 6082, 14237, 253, 2022, 7680, 273, 436, 789, 8696, 275, 253, 6333, 534, 26295, 253, 2852, 4471, 6082, 6779, 1677, 253, 767, 2469, 4394, 407, 970, 891, 7870, 940, 21755, 347, 973, 347, 21255, 247, 5886, 6333, 50276, 783, 7870, 940, 21755, 6333, 310, 17194, 407, 253, 1750, 326, 253, 1789, 85, 375, 11753, 12714, 20994, 3304, 10556, 3103, 253, 2929, 23970, 247, 1881, 42959, 3828, 281, 3761, 253, 25195, 273, 253, 2173, 5113, 2439, 767, 13009, 253, 3453, 273, 436, 6333, 846, 9433, 253, 3997, 10495, 2990, 310, 1925, 7870, 6779, 50276, 783, 5886, 6333, 310, 4081, 281, 1566, 253, 6355, 875, 1789, 8557, 436, 310, 9009, 8437, 20, 970, 1881, 42959, 875, 253, 8062, 432, 1840, 285, 4228, 6779, 273, 5113, 891, 285, 465, 25195, 253, 3453, 310, 2879, 281, 5731, 253, 7870, 6779, 432, 253, 7870, 940, 21755, 6333, 9142, 253, 2852, 1375, 310, 8131, 970, 247, 4872, 9261, 273, 253, 4228, 6779, 273, 673, 246, 285, 253, 9300, 7870, 6779, 50276, 783, 2929, 44995, 616, 6311, 6779, 327, 253, 1391, 87, 
6554, 10895, 323, 3492, 14720, 10554, 14433, 285, 26405, 4757, 50276, 783, 5611, 11911, 7870, 940, 21755, 285, 5886, 11911, 403, 973, 17194, 285, 616, 2216, 285, 22786, 1646, 5272, 50276, 783, 2929, 5012, 2266, 3904, 327, 253, 4836, 273, 3492, 14720, 41731, 14692, 5368, 2987, 327, 1391, 87, 6554, 50276, 20881, 1255, 265, 50276, 33722, 28913, 3368, 323, 7870, 940, 21755, 285, 5886, 6333, 253, 2929, 3916, 326, 253, 15239, 12714, 20994, 29444, 1598, 3304, 253, 3492, 352, 4081, 253, 7870, 940, 21755, 4116, 5122, 281, 8415, 326, 2568, 352, 310, 417, 6760, 604, 436, 6333, 2686, 35910, 436, 1895, 342, 326, 253, 9414, 1057, 417, 871, 604, 436, 6333, 310, 3576, 253, 1072, 323, 253, 5886, 6333, 253, 2929, 943, 1304, 3904, 342, 285, 1293, 253, 897, 273, 436, 6333, 281, 2939, 697, 3486, 50275, 33722, 8245, 281, 2939, 253, 8453, 273, 253, 4081, 11911, 7870, 940, 21755, 50276, 16429, 6333, 253, 9414, 671, 3198, 1543, 1293, 731, 436, 8245, 476, 320, 3562, 407, 15706, 253, 4081, 11911, 342, 247, 3997, 3579, 11454, 2990, 342, 11467, 253, 1072, 1180, 273, 3602, 347, 253, 4081, 11911, 326, 26295, 253, 1735, 4228, 6779, 246, 18, 1754, 327, 253, 32147, 318, 273, 253, 4228, 6779, 387, 673, 246, 285, 246, 18, 253, 2929, 23970, 690, 2238, 273, 8245, 275, 2829, 374, 8909, 79, 32063, 24187, 407, 8493, 253, 577, 7877, 8062, 3021, 253, 2990, 310, 1669, 281, 3283, 253, 1735, 1375, 12718, 1754, 327, 253, 1655, 4228, 6779, 387, 673, 246, 534, 310, 12497, 326, 310, 2139, 891, 21434, 253, 4477, 281, 3359, 253, 8245, 347, 8042, 562, 1840, 285, 281, 1304, 253, 4795, 3904, 275, 253, 30080, 22559, 50276, 783, 2929, 26662, 616, 1332, 281, 2085, 534, 310, 253, 760, 643, 32048, 326, 19732, 1131, 11935, 1491, 2085, 8725, 2220, 253, 4228, 1332, 273, 39887, 970, 247, 374, 11830, 296, 78, 253, 2361, 3904, 275, 2829, 374, 323, 2085, 403, 3012, 1384, 7197, 685, 323, 39887, 534, 1057, 417, 1056, 3282, 281, 479, 3340, 672, 2819, 387, 253, 2085, 2929, 275, 534, 2085, 310, 327, 1061, 342, 39887, 327, 3388, 253, 1072, 6556, 2032, 323, 2829, 495, 253, 278, 339, 2361, 275, 2085, 310, 3012, 1805, 685, 39887, 5727, 275, 436, 2929, 278, 339, 310, 3012, 7197, 5693, 323, 278, 339, 436, 5936, 281, 479, 326, 627, 310, 1633, 3430, 342, 849, 2085, 369, 10166, 15419, 11634, 253, 2929, 25339, 247, 1921, 2139, 8909, 79, 310, 8936, 2429, 281, 2085, 275, 7652, 533, 1057, 417, 2953, 2139, 627, 310, 824, 247, 5699, 8037, 875, 39887, 285, 2085, 812, 253, 4477, 4496, 19148, 436, 323, 479, 436, 2722, 285, 25222, 253, 878, 323, 247, 8245, 347, 5469, 1840, 50276, 33722, 7092, 4278, 253, 2929, 1057, 417, 2085, 667, 7092, 4278, 751, 24088, 5556, 6081, 4715, 2281, 9840, 1318, 4373, 22041, 3966, 436, 2789, 253, 2929, 417, 41374, 285, 891, 21434, 253, 4477, 281, 823, 841, 4278, 1057, 253, 2127, 755, 4439, 846, 14924, 50276, 783, 2929, 44995, 616, 4081, 1566, 760, 327, 253, 1391, 87, 6554, 10895, 281, 1805, 2939, 253, 3486, 273, 436, 789, 253, 1332, 3198, 281, 320, 6760, 387, 1878, 327, 581, 643, 10895, 824, 347, 24088, 28335, 28335, 687, 26824, 305, 1817, 9432, 1162, 355, 28335, 247, 10401, 10895, 323, 5889, 267, 5231, 285, 11935, 14720, 17857, 32888, 9169, 5816, 2905, 789, 436, 2929, 38771, 247, 4623, 2905, 789, 432, 1670, 254, 3682, 29571, 1162, 355, 10726, 38562, 275, 5304, 1566, 3169, 35221, 4715, 944, 77, 6247, 436, 789, 16756, 10726, 14237, 432, 3888, 285, 33772, 616, 8062, 323, 1566, 3169, 35221, 4715, 50276, 3062, 1189, 436, 2929, 760, 28070, 2905, 789, 534, 17209, 327, 4471, 6082, 14237, 2568, 627, 4961, 2905, 789, 671, 323, 1327, 23939, 6082, 8062, 6779, 
4715, 534, 943, 320, 11106, 824, 347, 50275, 14785, 70, 6554, 1162, 355, 440, 35421, 4715, 273, 1789, 2605, 285, 8062, 432, 10556, 5723, 2824, 6247, 50275, 1559, 1595, 8420, 1162, 355, 4685, 1789, 8062, 323, 18366, 4440, 292, 729, 2842, 9066, 30105, 1087, 43425, 50276, 12563, 253, 789, 943, 26542, 2201, 2987, 275, 3492, 10554, 1580, 352, 310, 253, 4836, 253, 2990, 4850, 10166, 327, 50276, 14906, 1162, 355, 19191, 48960, 3492, 10554, 5745, 81, 50276, 925, 1972, 4635, 1162, 355, 19191, 21624, 12541, 3492, 10554, 1499, 23667, 81, 50276, 69, 26947, 1162, 355, 19191, 3492, 5978, 342, 247, 6311, 2720, 18504, 72, 50275, 33722, 5301, 281, 1655, 1327, 23939, 1789, 6779, 3169, 7274, 323, 3492, 10554, 352, 651, 452, 644, 5322, 604, 253, 2929, 651, 452, 2429, 281, 1327, 23939, 1789, 3492, 10554, 7274, 281, 1691, 616, 789, 625, 715, 3634, 50276, 2520, 789, 1057, 417, 2968, 342, 253, 12794, 28931, 273, 253, 3492, 10554, 4836, 3139, 352, 11544, 18260, 26295, 247, 1735, 1375, 1677, 767, 2469, 7313, 7624, 352, 2550, 3835, 253, 1027, 15216, 253, 2852, 476, 2186, 253, 789, 432, 458, 70, 1162, 355, 2686, 2692, 326, 407, 15890, 323, 436, 28931, 253, 6311, 1566, 19132, 327, 253, 4836, 273, 3492, 10554, 1580, 352, 1057, 417, 3388, 689, 2709, 13650, 285, 476, 3835, 2709, 1027, 2852, 15216, 50276, 74, 651, 751, 281, 4089, 253, 4477, 1379, 327, 326, 285, 923, 436, 625, 347, 2852, 789, 323, 4471, 6082, 8062, 4715, 50276, 16691, 7211, 50276, 27490, 577, 627, 310, 1182, 42927, 7019, 891, 2868, 253, 1273, 581, 943, 320, 1182, 18233, 50276, 20790, 3054, 326, 616, 2746, 310, 7032, 273, 2852, 13009, 10554, 275, 2570, 495, 69, 13451, 891, 452, 281, 14936, 342, 436, 1580, 436, 10895, 327, 534, 253, 1332, 310, 6760, 310, 2080, 1977, 432, 667, 1524, 10186, 3492, 10895, 326, 310, 2139, 891, 651, 417, 1067, 352, 2570, 50275, 4168, 253, 1416, 432, 2361, 281, 355, 3703, 32063, 1881, 12185, 87, 281, 1056, 352, 6927, 281, 1239, 275, 2829, 374, 963, 993, 4677, 818, 11743, 697, 50276, 953, 1077, 2872, 15330, 50276, 2050, 1103, 1386, 7133, 50276, 1282, 7133, 50276, 13206, 818, 556, 490, 285, 260, 533, 275, 253, 11743, 760, 247, 285, 270, 2826, 275, 260, 253, 1269, 10565, 4566, 432, 374, 281, 374, 533, 627, 403, 884, 3888, 50276, 20, 281, 495, 50275, 783, 2929, 5393, 326, 253, 5113, 14462, 50276, 7110, 281, 320, 5415, 949, 2317, 1223, 275, 253, 1735, 6197, 15571, 849, 253, 1735, 13358, 4228, 1789, 21624, 1375, 310, 8131, 581, 651, 452, 281, 897, 24088, 11454, 258, 3229, 281, 2686, 1566, 5415, 1789, 3200, 594, 5415, 943, 320, 7932, 407, 24088, 6032, 50276, 783, 4028, 273, 253, 2929, 3198, 690, 35952, 1580, 253, 41066, 275, 1142, 14683, 1057, 417, 1056, 3282, 24088, 5113, 3200, 8414, 342, 253, 5044, 12057, 4473, 50274, 249, 253, 11743, 273, 818, 67, 352, 2296, 1386, 7133, 6492, 253, 9777, 273, 4116, 5912, 1223, 275, 253, 4677, 352, 310, 3542, 326, 253, 4868, 273, 4797, 310, 470, 1525, 281, 479, 512, 253, 26112, 1007, 625, 390, 1679, 253, 1072, 1223, 581, 273, 731, 310, 2168, 470, 1525, 562, 273, 337, 253, 643, 4394, 943, 320, 840, 209, 2874, 849, 1057, 326, 1056, 3282, 50276, 16534, 253, 4477, 417, 921, 512, 253, 13461, 875, 512, 5113, 875, 767, 13009, 671, 323, 2709, 6667, 436, 651, 2085, 1199, 625, 12288, 281, 253, 9414, 50275, 262, 310, 2834, 281, 6308, 253, 1789, 4866, 275, 4677, 818, 260, 352, 651, 320, 1270, 604, 253, 4477, 812, 823, 690, 9588, 281, 1361, 23114, 50275, 783, 11911, 4081, 275, 436, 789, 403, 973, 17194, 285, 4158, 285, 4711, 2266, 1543, 327, 3492, 14720, 2568, 891, 717, 7514, 670, 253, 12497, 7103, 1293, 253, 490, 
77, 569, 285, 253, 8245, 891, 5469, 275, 32213, 352, 310, 2834, 281, 22048, 253, 4588, 3486, 273, 253, 4081, 1332, 285, 697, 6944, 11911, 25761, 5661, 4278, 403, 5816, 534, 2789, 352, 2834, 281, 18302, 697, 2361, 3904, 326, 310, 2139, 619, 3302, 13716, 310, 2708, 253, 14924, 7887, 533, 891, 717, 1527, 281, 19427, 619, 13716, 604, 253, 4477, 513, 10481, 2953, 253, 2792, 5439, 275, 32213, 50275, 187, 187, 4118, 18435, 27, 2520, 789, 10262, 247, 4460, 1332, 288, 281, 3037, 1789, 8062, 432, 440, 47728, 10556, 285, 2722, 697, 5373, 327, 19349, 14720, 285, 2852, 3665, 10554, 50276, 2520, 2929, 2959, 577, 2762, 10123, 285, 337, 4016, 2278, 275, 253, 30080, 22559, 253, 4477, 452, 9713, 954, 273, 253, 7350, 913, 9193, 436, 789, 310, 1077, 4722, 285, 22828, 281, 320, 3863, 327, 17857, 32888, 1384, 1423, 253, 30628, 858, 7164, 690, 9865, 7350, 326, 943, 320, 9713, 275, 253, 2457, 4049, 254, 609, 5102, 2715, 273, 253, 2929, 253, 4477, 403, 14659, 281, 1056, 643, 3309, 2544 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper considers one important question how to fairly compare the adversarial robustness between detection-based defenses and classification-based defenses from the theoretical perspective the authors show that one can always ideally construct a robust classifier from a robust detector which has equivalent robustness and vice versa based on this construction method they are able to transfer the robustness between robust detectors and robust classifiers finally they find that most existing detection defenses achieve suspiciously high robust performance compared with state-of-the-art robust classifiers if they apply the proposed transferring criteria we summarize the strengths and weaknesses of this paper as follows strengths 1 it is the first paper to attempt to unify the detection and robust classifier approaches 2 the authors provide constructive steps to reduce the detector to a classifier and vice versa 3 the part about certified defenses highlights the strictness of the theoretical results weaknesses 1 in the paper the authors prove that one can theoretically construct a robust inefficient classifier from a robust detector which has equivalent robustness theorem 4 however the constructive steps provided in theorem 4 cannot be exactly solved in practice for most of the defense models except for certified defenses if it is extremely hard to figure out such a classifier we still could not say there exists such a feasible classifier with the same robustness as the detection model therefore the robust accuracy reported by the constructed classifier could be unreasonably high similar evidence can be found in table 1 as a result the construction process in theorem 4 might give a false sense of the true robustness of the classifier as well as the original detector 2 one possible direction is that one could approximately solve the inefficient problem in theorem 4 using gradient methods or black-box optimization methods imagine that one can approximately solve this problem then one can get such a classification model in practice which will greatly improve the potential practical use of the proposed theorem 3 based on the discussion above we also hesitate about the practical importance of the experimental results in table 1 the correctness of the theorems needs verification of the irrelevance of computational complexity in the reduction which is not provided and the arguments about the suspiciously high performance of detection defenses are based on adaptive attacks instead of their theorems therefore as said in section 3 interpreting our reduction does not mean that the defenses claims are wrong the experiment results shown in table 1 which are based on the theorems cannot be used to make any conclusion about the performance of existing detection defenses 4 since most of the results of this paper are based on norm-based attacks are the results valid for other types of attacks such as spatial attacks and so on other comments or remarks a typo in theorem 4 proof in the second and third bullets x should be $\hat{x}$ overall the paper considers an interesting problem and tries to unify the detection and classifier defenses however the theorem is only correct without considering the computational complexity and poses another question about the relation between robust classifier training and computational complexity also the experiment part should include approximate results of the reduction steps to verify the
feasibility of the theorems in practice docsepadversarial examples are test time attacks in which the input is modified by up to distance eps under some metric and the goal of adversarially robust learning is to have high generalized accuracy even under such attacks one way to make predictions is to always output a label another way is to abstain/detect when the learner thinks the input is not clean and is perturbed with the way we evaluate the performance in the detection model being to count detected perturbed inputs as correctly classified the paper asks a very natural question is it easier to learn when detection/abstention is allowed or not the main result of the paper is very clean for any metric d and any eps the existence of a learner that achieves accuracy c under the detection model under 2eps perturbations is information-theoretically equivalent to the existence of a classifier in the no-detection model with accuracy c under eps perturbation the proof is constructive but it is not efficiently constructive namely given a classifier in either of the two settings above the paper shows a rather simple but smart way of constructing another classifier with the parameters stated above in the other model the paper then takes this connection to revisit the results of quite a few papers from the literature in which they have claimed defenses that use detection as their key idea the paper observes that the bounds that many of those papers claim would imply classifiers with no detection/abstention that beat the state-of-the-art adversarially robust classifiers the paper cautiously claims that this indicates that the defenses of those papers are not actually secure but rather just not broken under the simple attacks tried by the authors please list both the strengths and weaknesses of the paper when discussing weaknesses please provide concrete actionable feedback on the paper on the positive side the connection between detection and classification is a very natural question that deserves attention the paper proves a simple but very nice theorem that as far as i know was not proved before the connection between testing and decoding is not completely new in coding theory and the paper also mentions this but observing this phenomenon in the context of robust learning as far as i know is new also i like the fact that the paper uses this connection to study the implications of results already claimed in the literature on the downside the fact that the theorem of the paper is proved using information-theoretic rather than computationally efficient reductions limits the ways one can benefit from such a connection in particular 1 it is not completely fair to question previous works claimed results on robustness using detection because those papers did not claim certified results or information-theoretic results but rather hardness of breaking their schemes since the reductions in this paper are not computationally feasible one cannot conclude that those schemes were indeed not secure computationally 2 in most if not all settings in which robust learning is a hot topic one already knows the existence of a robust ground truth function in particular for image classification it is the assumption that humans are robust to small eps perturbations and the goal is to find such classifiers automatically in such contexts the results of this paper become obsolete because the reductions exist trivially this further limits the applicability of the results other comments page 2 to the authors knowledge there is no known way of leveraging
computational inefficiency to build more robust models the cited paper by garg et al seems to exactly show the possibility that computational efficiency could be leveraged to achieve robustness the two papers garg et al 2020 and bubeck et al 2018 are actually quite different in how they deal with the role of computational efficiency one deals with polytime learning and the other deals with polytime attacking i think it is the latter that is more directly related to this works message following up on issue 2 mentioned above one might say that the result of the paper still applies even if the ground truth is not robust to eps perturbations but then in that case it brings up another issue the definition used in this paper would not imply that an adversarial example is actually misclassified this issue is discussed in some previous work such as revisiting adversarial risk suggala et al aistats 2019 and adversarial risk and robustness general definitions diochnos et al neurips 2018 the paper proves a natural theoretical result that is at the heart of robust learning the result is information-theoretic but i still find it quite natural i think one should be very cautious to not overly interpret the implications of this paper but i think the mere theoretical observation that testing and decoding in the context of adversarial learning are equivalent has merit that at least puts this paper on the border for iclr docsepthe paper showcases the vulnerability of existing adversarial detection algorithms multiple existing detection-based defense algorithms are considered and it is shown theoretically that the robustness claimed by those algorithms might not be the same as claimed in the corresponding papers while the idea is interesting and needs some attention considering only the attacker to be intelligent is somewhat wrong or misleading in both this and the existing papers referred to if the defense can be broken using a new adaptive attack that uses the knowledge of the defense then the defender must also have the same knowledge to modify the defense to carefully check the limit of the defense another interesting question is why the attacker needs to generate an attack at all if they can assume all knowledge of the defense and the target model why not simply modify the decision therefore before showcasing that the defenses are not working it is expected to consider this and an adaptive defense or give the same freedom to the defender as well the claims made in the paper are not empirically tested they are mainly based on the assumptions of the authors or one selected paper it will be great if the authors can showcase that the defenses are actually not working what do the authors mean by "the evaluation is inefficient" on page 7 please refer to some strong defenses 1 detection-based defense against adversarial examples from the steganalysis point of view in proceedings of the ieee/cvf conference on computer vision and pattern recognition pp 4825-4834 2 damad database attack and model agnostic adversarial perturbation detector in ieee transactions on neural networks and learning systems tnnls 2021 3 image transformation based defense against adversarial perturbation
what the author meant and downgrade take the credit of the defense work most of the assumptions are based on one work only rebuffi et al 2021 i feel the authors need to revisit the paper and literature thoroughly not only a few papers which showcase that existing defenses do not work or depend on the robsutness paper the paper is based on assumptions and findings of one single paper most of the time without explicit showcase that the existing defense will not work the analysis seems misleading and unfair to the parts of the defenders as well the authors believe that the existing defenses do not resemble the worstcase attack please generate such an attack and showcase without explicitly touching the defenses that defense is not working in my knowledge the true and fair concept to both attacker and defender needs to be there to make serious progress in the field else this can be just another paper reflecting the singularities of the defenses docsepthis submission connects the areas of adversarial training and adversarial robust detection generally speaking i think the conclusions in the submitted paper can provide beneficial insights for the community and avoid overclaims in future adversarial detection research strength this work connects the areas of adversarial training and adversarial detection and the writing is easy to follow it is of great significance to the future development of adversarial detection research weakness the construction of equivalent classifiers and detectors in theorem 4 and 5 are interesting however i have concerns about theorem 4 that epsilonrobust detection implies inefficient epsilon2robust classification for one thing it lacks a practical and efficient solution to find a perturbed input that is classified differently for another the claim is not necessary in practice for example if the input is rejected by an adversarial detection practitioners can simply add random noise on the input until the new input is not rejected and by the way compared to the analysis of robust detection of robust classifying considering adaptive attacks towards adversarial classification and adversarial detection simultaneously is more important the title of this submission seems inappropriate its widely acknowledged the success rate of adversarial detection far outweighs the success rate of adversarial classifying in previous studies to be specific since a large amount of high dimensional images within an epsilon ball classifying all images is a difficult task for neural networks with limited capacity by contrast the detection task just has to tell the difference between adversarial examples and natural examples hence the findings in this submission are not convincing that detecting adversarial examples is as hard as classifying them similar difficulties should be indicated by similar success rates otherwise it will give the public misunderstanding a modest proposal avoid using the paragraphing abstract and delete emph and in the open review submission page in general i think this submission can provide beneficial insights and evaluations for the field of adversarial detection however it lacks practical solutions which implies limited contributions i hope my concerns about theorem 4 and 5 can be considered or pointed out my misunderstanding in the discussion stage though i cannot recommend acceptance at this stage i will increase my score if my concerns are solved properly ### Summary:
the paper investigates a very interesting problem the connection between adversarial detection and adversarial classification theoretically the authors show that one can always ideally construct a robust classifier from a robust detector that has equivalent robustness and vice versa this theorem is only correct without considering the computational complexity however the authors did not provide any approximate results of the reduction steps to verify the feasibility of the theorems in practice which is the main concern of all reviewers so we can say the paper is a reminder to the community that we need to be careful about detection results but it did not provide any evidence to say they are overclaimed only a conjecture based on the theorem in the paper which greatly limits the contribution of the paper due to the competitiveness of iclr i cannot recommend accepting it
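As a side note on the reduction that these reviews and the summary refer to (theorem 4 in the submission: a detector robust at radius 2*eps yields an inefficient classifier robust at radius eps), the detector-to-classifier direction can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation; the function names, the brute-force search over candidate perturbations, and the fallback label are assumptions made here for illustration. The exhaustive search over the eps-ball is exactly the inefficiency the reviewers object to.

```python
def classifier_from_detector(detect_or_classify, eps_ball_cover):
    """Illustrative sketch (not the authors' code) of reducing a robust
    detector to an inefficient robust classifier.

    detect_or_classify(x): returns a label, or None when the detector abstains
                           (flags x as adversarial); assumed robust at radius 2*eps.
    eps_ball_cover:        an (impractically large) set of perturbations covering
                           the eps-ball, standing in for the exhaustive search
                           that the information-theoretic argument allows.
    """
    def classify(x):
        # If the detector is willing to answer on x itself, use that answer.
        label = detect_or_classify(x)
        if label is not None:
            return label
        # Otherwise search the eps-ball around x for any point the detector accepts.
        # If x lies within eps of a clean input, every accepted point found here is
        # within 2*eps of that clean input, so the 2*eps-robust detector labels it
        # correctly -- this is what makes the constructed classifier eps-robust.
        for delta in eps_ball_cover:
            label = detect_or_classify(x + delta)
            if label is not None:
                return label
        # Degenerate case: nothing in the ball is accepted; return an arbitrary label.
        return 0
    return classify
```

Whether this search can be approximated efficiently, for example by the gradient or black-box methods the first review suggests, is precisely what the reviewers ask the authors to demonstrate before the reduction can support conclusions about existing detection defenses.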
[ 253, 2929, 19401, 271, 4722, 1895, 285, 14177, 281, 440, 1419, 253, 5481, 285, 30410, 25774, 2299, 253, 10012, 310, 760, 3451, 1293, 7296, 253, 15180, 10454, 285, 24543, 1529, 1953, 670, 253, 5886, 875, 10237, 30410, 3733, 285, 15180, 10454, 671, 253, 3368, 629, 943, 2486, 16851, 1543, 273, 253, 5141, 5018, 281, 12654, 253, 25720, 273, 253, 39383, 275, 3946, 50276, 7152, 339, 11022, 735, 24406, 6667, 403, 1071, 673, 8104, 275, 534, 253, 3280, 310, 7321, 407, 598, 281, 4181, 299, 793, 762, 690, 7982, 285, 253, 4736, 273, 18539, 274, 1365, 10237, 4715, 310, 281, 452, 1029, 14923, 7200, 1014, 762, 824, 8104, 581, 1039, 281, 1056, 13650, 310, 281, 1900, 3453, 247, 5203, 1529, 1039, 310, 281, 20965, 404, 33492, 672, 253, 458, 47612, 11121, 253, 3280, 310, 417, 4076, 285, 310, 44711, 342, 253, 1039, 359, 7472, 253, 3045, 275, 253, 5481, 1566, 310, 281, 1385, 5189, 44711, 14800, 347, 9113, 10509, 50276, 783, 2929, 12325, 247, 1077, 3626, 1953, 310, 352, 6927, 281, 3037, 672, 5481, 357, 296, 404, 310, 4136, 390, 417, 253, 2022, 906, 273, 253, 2929, 310, 1077, 4076, 323, 667, 7982, 277, 285, 667, 299, 793, 253, 6242, 273, 247, 458, 47612, 326, 33526, 7200, 260, 762, 5481, 1566, 762, 374, 2265, 26309, 310, 1491, 28055, 6425, 281, 253, 6242, 273, 247, 30410, 275, 253, 6913, 292, 2441, 1566, 342, 7200, 260, 285, 299, 793, 20452, 50276, 783, 4737, 310, 25799, 533, 352, 310, 417, 14556, 25799, 10775, 1677, 247, 30410, 275, 2057, 273, 253, 767, 7533, 1840, 253, 2929, 2722, 247, 2581, 2969, 533, 7060, 1039, 273, 26736, 1529, 30410, 342, 253, 3602, 4767, 1840, 275, 253, 643, 1566, 50276, 783, 2929, 840, 3936, 436, 4602, 281, 45735, 253, 1543, 273, 3240, 247, 1643, 9380, 432, 253, 6239, 275, 534, 597, 452, 7558, 25774, 326, 897, 5481, 347, 616, 2234, 2934, 253, 2929, 40687, 326, 253, 14493, 326, 1142, 273, 1110, 9380, 1750, 651, 16084, 49996, 342, 642, 5481, 357, 296, 404, 326, 7171, 253, 1375, 273, 253, 1445, 18539, 274, 1365, 10237, 49996, 253, 2929, 45254, 3916, 326, 436, 6492, 326, 253, 25774, 273, 1110, 9380, 403, 417, 2686, 7895, 533, 2581, 417, 7154, 762, 2969, 8104, 3597, 407, 253, 4477, 4496, 1618, 1097, 253, 20544, 285, 32213, 273, 253, 2929, 672, 16585, 32213, 4496, 2085, 11859, 49353, 8680, 327, 253, 2929, 50276, 251, 253, 2762, 1930, 253, 4602, 875, 5481, 285, 9162, 310, 247, 1077, 3626, 1953, 326, 22828, 4116, 253, 2929, 19539, 247, 2969, 533, 1077, 5322, 10012, 326, 347, 2080, 347, 891, 871, 369, 417, 8058, 1078, 253, 4602, 875, 5175, 285, 28490, 310, 417, 4336, 747, 275, 12425, 3762, 285, 253, 2929, 671, 25957, 436, 533, 20764, 436, 11562, 275, 253, 3634, 273, 10237, 4715, 347, 2080, 347, 891, 871, 310, 747, 671, 891, 751, 253, 958, 326, 253, 2929, 4648, 436, 4602, 281, 1263, 253, 12739, 273, 1543, 2168, 7558, 275, 253, 6239, 50276, 251, 253, 1066, 1930, 253, 958, 326, 253, 10012, 273, 253, 2929, 310, 8058, 970, 1491, 253, 30325, 2581, 685, 43245, 5919, 23082, 7787, 253, 4088, 581, 476, 5649, 432, 824, 247, 4602, 275, 1798, 337, 352, 310, 417, 4336, 4344, 281, 1953, 2045, 2987, 7558, 1543, 327, 31640, 970, 5481, 984, 1110, 9380, 858, 417, 1750, 18065, 1543, 390, 1491, 253, 30325, 1543, 533, 2581, 38576, 273, 10155, 616, 15849, 1580, 253, 23082, 275, 436, 2929, 403, 417, 43245, 17887, 581, 2550, 7525, 326, 1110, 15849, 497, 6296, 417, 7895, 43245, 374, 954, 604, 417, 512, 7533, 275, 534, 10237, 4715, 310, 247, 3511, 9400, 581, 2168, 6057, 253, 6242, 273, 247, 10237, 3216, 9988, 1159, 275, 1798, 323, 2460, 9162, 352, 310, 253, 9376, 326, 7497, 403, 10237, 281, 1355, 299, 793, 26309, 285, 
253, 4736, 310, 281, 1089, 824, 49996, 8356, 275, 824, 22349, 253, 1543, 273, 436, 2929, 2489, 40072, 984, 253, 23082, 2226, 35820, 1365, 436, 2007, 7787, 253, 30437, 273, 253, 1543, 50276, 977, 5701, 50276, 6377, 374, 281, 253, 4477, 3640, 627, 310, 642, 1929, 1039, 273, 19732, 2977, 15180, 275, 46505, 281, 1973, 625, 10237, 3210, 50275, 783, 11106, 2929, 407, 305, 1662, 1162, 355, 3133, 281, 4555, 921, 253, 6387, 326, 15180, 6733, 812, 320, 19732, 2961, 281, 5115, 31640, 50276, 783, 767, 9380, 305, 1662, 1162, 355, 9169, 285, 270, 4338, 777, 1162, 355, 4765, 403, 2686, 3240, 1027, 275, 849, 597, 2968, 342, 253, 2554, 273, 15180, 6733, 581, 13330, 342, 877, 1767, 553, 4715, 285, 253, 643, 13330, 342, 877, 1767, 553, 20362, 891, 1158, 697, 253, 6158, 326, 310, 625, 3587, 2905, 281, 436, 2987, 3935, 50276, 34814, 598, 327, 253, 2523, 374, 5393, 1840, 581, 1537, 1333, 326, 253, 906, 273, 253, 2929, 1335, 10384, 1014, 604, 253, 3216, 5083, 310, 417, 10237, 281, 299, 793, 26309, 533, 840, 275, 326, 1083, 352, 10316, 598, 1529, 2523, 253, 5426, 908, 275, 436, 2929, 651, 417, 16084, 326, 271, 48960, 1650, 310, 2686, 3731, 39651, 436, 2523, 310, 5469, 275, 690, 2045, 789, 824, 347, 27694, 2996, 48960, 2495, 1803, 7080, 1162, 355, 247, 8727, 746, 285, 48960, 2495, 285, 31640, 2087, 14308, 277, 900, 1451, 375, 1162, 355, 5723, 2824, 1093, 50276, 783, 2929, 19539, 247, 3626, 10527, 906, 326, 310, 387, 253, 2798, 273, 10237, 4715, 253, 906, 310, 1491, 253, 30325, 533, 891, 1335, 1089, 352, 3240, 3626, 50276, 74, 1158, 581, 943, 320, 1077, 31798, 281, 417, 27662, 4665, 253, 12739, 273, 436, 2929, 533, 891, 1158, 253, 11019, 10527, 8310, 326, 5175, 285, 28490, 275, 253, 3634, 273, 48960, 4715, 403, 6425, 556, 247, 15785, 326, 387, 1878, 12516, 436, 2929, 327, 253, 5680, 323, 17857, 32888, 50276, 7152, 339, 431, 248, 2929, 921, 12866, 253, 24189, 273, 5368, 48960, 5481, 11333, 2709, 5368, 5481, 3169, 5684, 11333, 403, 2783, 285, 452, 2011, 28055, 326, 253, 31640, 7558, 407, 1110, 11333, 1537, 417, 320, 253, 1072, 347, 7558, 275, 253, 3969, 9380, 50276, 6050, 253, 2934, 310, 4722, 285, 3198, 690, 4116, 2299, 7296, 760, 253, 30539, 310, 17497, 310, 8489, 3430, 390, 24363, 275, 1097, 436, 285, 5368, 9380, 6289, 604, 253, 5684, 476, 320, 7154, 970, 247, 747, 17825, 2983, 970, 253, 3640, 273, 253, 5684, 840, 253, 26762, 1364, 671, 452, 253, 1072, 3640, 281, 10007, 253, 5684, 281, 9257, 2451, 253, 2701, 273, 5684, 50276, 23955, 4722, 1953, 310, 2139, 253, 30539, 3198, 281, 6635, 271, 2983, 604, 344, 6689, 476, 5467, 512, 3640, 273, 253, 5684, 285, 253, 2303, 1566, 2139, 417, 3365, 10007, 253, 3061, 3103, 1078, 44762, 2355, 253, 25774, 403, 417, 2444, 352, 310, 3264, 281, 1908, 436, 285, 17825, 5684, 390, 1918, 253, 1072, 7185, 281, 253, 26762, 347, 973, 50276, 783, 3916, 1160, 275, 253, 2929, 403, 417, 45190, 5762, 597, 403, 7194, 1754, 327, 253, 13260, 273, 253, 4477, 390, 581, 4236, 2929, 352, 588, 320, 1270, 604, 253, 4477, 476, 34647, 326, 2686, 253, 25774, 403, 417, 2444, 50276, 5371, 513, 253, 4477, 1599, 407, 253, 50276, 783, 7103, 310, 31334, 327, 3239, 818, 4496, 3730, 281, 690, 2876, 25774, 50276, 18, 5481, 1754, 5684, 1411, 48960, 6667, 432, 253, 331, 909, 12792, 1127, 273, 1859, 275, 10061, 273, 253, 26332, 70, 886, 39985, 8059, 327, 4382, 8113, 285, 3102, 8981, 7266, 5693, 1099, 2385, 1706, 374, 2687, 324, 5447, 2983, 285, 1566, 639, 79, 6932, 48960, 20452, 13562, 275, 26332, 1796, 13122, 327, 11454, 6928, 285, 4715, 2718, 246, 9866, 5200, 43425, 495, 2460, 9261, 1754, 5684, 1411, 48960, 20452, 
327, 3676, 4715, 3210, 275, 26332, 1796, 13122, 327, 3469, 494, 285, 7895, 12672, 32989, 1026, 43425, 1936, 1283, 642, 608, 7266, 20048, 3763, 15144, 50276, 3549, 6241, 969, 253, 1307, 9065, 5045, 10237, 2983, 310, 417, 2590, 352, 3133, 751, 253, 2929, 310, 1754, 327, 690, 638, 515, 360, 6372, 326, 253, 13562, 476, 417, 789, 4496, 19148, 50276, 8384, 6197, 327, 3239, 818, 50276, 338, 436, 13562, 25774, 31640, 3916, 497, 3451, 50275, 1439, 4076, 752, 253, 2488, 5486, 285, 1066, 7698, 1379, 253, 6152, 273, 253, 5684, 789, 50276, 2252, 273, 253, 13260, 403, 1754, 327, 581, 789, 760, 6142, 2066, 74, 1162, 355, 43425, 891, 1928, 253, 4477, 878, 281, 45735, 253, 2929, 285, 6239, 16575, 417, 760, 247, 1643, 9380, 534, 34647, 326, 5368, 25774, 513, 417, 789, 390, 3469, 327, 253, 687, 1768, 307, 1255, 2929, 50275, 783, 2929, 310, 1754, 327, 13260, 285, 4342, 273, 581, 2014, 2929, 954, 273, 253, 673, 1293, 6843, 34647, 326, 253, 5368, 5684, 588, 417, 789, 253, 1783, 3133, 24363, 285, 16593, 281, 253, 4243, 273, 253, 31342, 347, 973, 50275, 783, 4477, 2868, 326, 253, 5368, 25774, 513, 417, 28788, 253, 9065, 5045, 2983, 4496, 6635, 824, 271, 2983, 285, 34647, 1293, 11120, 19883, 253, 25774, 326, 5684, 310, 417, 2444, 50276, 249, 619, 3640, 253, 2032, 285, 4344, 4473, 281, 1097, 30539, 285, 26762, 3198, 281, 320, 627, 281, 1056, 4092, 4780, 275, 253, 1673, 2010, 436, 476, 320, 816, 1529, 2929, 18964, 253, 34001, 273, 253, 25774, 50276, 7152, 33032, 2520, 19529, 23417, 253, 3672, 273, 48960, 3733, 285, 48960, 10237, 5481, 3839, 8288, 891, 1158, 253, 11815, 275, 253, 9262, 2929, 476, 2085, 12912, 16039, 323, 253, 3114, 285, 3693, 689, 28803, 275, 2852, 48960, 5481, 2561, 4757, 436, 789, 23417, 253, 3672, 273, 48960, 3733, 285, 48960, 5481, 285, 253, 4028, 310, 3477, 281, 956, 352, 310, 273, 1270, 8453, 281, 253, 2852, 2440, 273, 48960, 5481, 2561, 50276, 20881, 1255, 253, 5140, 273, 6425, 49996, 285, 25421, 275, 10012, 577, 285, 608, 403, 4722, 2299, 891, 452, 7350, 670, 10012, 577, 326, 299, 4277, 18848, 461, 5481, 8018, 31334, 299, 4277, 19, 18848, 461, 9162, 323, 581, 2181, 352, 19756, 247, 8542, 285, 5919, 2900, 281, 1089, 247, 44711, 3280, 326, 310, 10509, 13359, 323, 1529, 253, 1750, 310, 417, 3309, 275, 3946, 323, 1650, 604, 253, 3280, 310, 10945, 407, 271, 48960, 5481, 24432, 476, 3365, 823, 3632, 6046, 327, 253, 3280, 1919, 253, 747, 3280, 310, 417, 10945, 285, 407, 253, 1039, 2429, 281, 253, 1783, 273, 10237, 5481, 273, 10237, 49653, 7296, 17825, 8104, 4404, 48960, 9162, 285, 48960, 5481, 10486, 310, 625, 1774, 50275, 783, 4060, 273, 436, 19529, 3133, 19582, 697, 7561, 14969, 253, 2323, 2281, 273, 48960, 5481, 2080, 32180, 21144, 253, 2323, 2281, 273, 48960, 49653, 275, 2045, 2175, 281, 320, 2173, 1580, 247, 1781, 2408, 273, 1029, 15759, 3888, 1561, 271, 299, 4277, 4023, 49653, 512, 3888, 310, 247, 2834, 4836, 323, 11454, 6928, 342, 3710, 5350, 407, 4499, 253, 5481, 4836, 816, 556, 281, 2028, 253, 3064, 875, 48960, 6667, 285, 3626, 6667, 7613, 253, 4342, 275, 436, 19529, 403, 417, 21414, 326, 15549, 48960, 6667, 310, 347, 1892, 347, 49653, 731, 2074, 12748, 943, 320, 4860, 407, 2074, 2323, 4142, 5010, 352, 588, 1918, 253, 1345, 40663, 50276, 66, 16453, 10419, 3693, 970, 253, 12494, 272, 12002, 285, 11352, 7013, 285, 50276, 249, 253, 1527, 2278, 19529, 3239, 50276, 249, 2087, 891, 1158, 436, 19529, 476, 2085, 12912, 16039, 285, 27163, 323, 253, 1673, 273, 48960, 5481, 2299, 352, 19756, 8542, 5482, 534, 8018, 3710, 9021, 891, 3524, 619, 7350, 670, 10012, 577, 285, 608, 476, 320, 2783, 390, 8042, 
562, 619, 40663, 275, 253, 5955, 3924, 2167, 891, 2550, 5583, 14924, 387, 436, 3924, 891, 588, 2572, 619, 4868, 604, 619, 7350, 403, 14042, 6283, 2490, 187, 4118, 18435, 27, 783, 2929, 2340, 684, 247, 1077, 4722, 1895, 273, 253, 10291, 875, 48960, 5481, 285, 48960, 9162, 28055, 253, 4477, 921, 326, 581, 476, 1900, 34243, 3989, 247, 10237, 30410, 432, 247, 10237, 13562, 326, 556, 6425, 31640, 285, 12008, 26620, 436, 10012, 310, 760, 3451, 1293, 7296, 253, 15180, 10454, 2299, 253, 4477, 858, 417, 2085, 667, 16851, 1543, 273, 253, 5141, 5018, 281, 12654, 253, 25720, 273, 253, 39383, 275, 3946, 534, 310, 253, 2022, 4468, 273, 512, 30628, 594, 359, 476, 1333, 253, 2929, 310, 247, 24388, 281, 253, 3114, 359, 878, 281, 320, 10182, 670, 253, 5481, 1543, 533, 858, 417, 2085, 667, 1941, 281, 1333, 597, 403, 689, 13578, 760, 247, 24366, 1754, 327, 253, 10012, 275, 253, 2929, 534, 10260, 7787, 253, 7680, 273, 253, 2929, 1955, 281, 253, 3947, 48826, 273, 17857, 32888, 891, 2550, 5583, 18738, 352 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: in this paper the authors consider using a learningbased method for influence estimation and influence maximization the authors propose a gnnbased model to estimate influence as an upper bound based on the estimated influence the authors use celf optimization to find the optimal seed set to further improve the efficiency the authors propose 1 an rl dqn based method and 2 a simplified influence function with only one layer the authors carry out experiments on several synthetic and realworld datasets strength 1 it is an interesting and practical idea to consider a learningbased approach for cascade problems on social networks 2 the authors carry out experiments on several networks both synthetic and realworld with a reasonably large number of nodes 1m weakness 1 some of the writing in the paper is not clear or is misleading first it is not clear what the input to the gnn is it is mentioned that h0 is in r^{n x d} but if only whether the node is in the seed set is encoded only 1 dimension is needed second it would be better for the authors to clarify that the monotonicity and submodularity of only eq 12 is proved there is no guarantee for eq 6 but only a qualitative evaluation for eq 6 as a result the usage of celf may not be well justified 2 the method with the best performance pun has little to do with the learning method it is just a diffusion function with 1 step also the network is actually not trained for the target one possible reason is that the activation probabilities are really small in the 1 / avg degree setting such that the cascade depths are really shallow in this case the authors should compare their method to some heuristics 3 the empirical evaluation is not very convincing first rssbased methods are not included in the influence estimation part second a running time comparison between a gpu based method and a cpu based method is not a fair comparison third as mentioned above heuristicbased methods should be included in the comparison as in table 5 4 one strength of a learningbased approach is to generalize beyond one given diffusion model it would be interesting to see 1 experiments on realworld cascades 2 generalization under diffusion model misspecification though the idea of using a learningbased approach for the im problem is interesting the proposed method does not demonstrate an advantage over existing methods the best performing one is more like existing heuristics also there are a few missing pieces in the evaluation docsepthis paper considers using learning methods to solve the wellknown influence maximization problem the paper proposes to estimate the upper bound of the influence by using graph neural networks which can be used in subsequent steps for selecting the seed nodes through either qlearning or a greedy algorithm based on the learned representation experiments on various datasets have been provided to evaluate the accuracy of influence estimation as well as the effect of influence maximization i would proceed by saying that i was one of the reviewers of this paper in its previous submission to another conference first of all i appreciate the efforts made by the authors to improve the previous version i am willing to increase my score if the authors could address my concerns strengths using a gnn for learning representations is a reasonable choice this paper is technically sound constructing a submodular score function based on the learned representation is also an interesting idea weaknesses using learning methods for
estimating influence or influence maximization have been extensively studied eg in a-d listed below and their references using a gnn and qlearning is reasonable but not novel a li hui mengting xu sourav s bhowmick changsheng sun zhongyuan jiang and jiangtao cui disco influence maximization meets network embedding and deep learning arxiv preprint arxiv190607378 2019 b du nan yingyu liang maria balcan and le song influence function learning in information diffusion networks in international conference on machine learning pp 2016-2024 pmlr 2014 c xia wenwen yuchen li jun wu and shenghong li deepis susceptibility estimation on social networks in proceedings of the 14th acm international conference on web search and data mining pp 761-769 2021 d liu qi biao xiang enhong chen hui xiong fangshuang tang and jeffrey xu yu influence maximization over largescale social networks a bounded linear approach in proceedings of the 23rd acm international conference on conference on information and knowledge management pp 171-180 2014 the experiments do not demonstrate a strong need for learningbased methods for influence estimation or influence maximization the influence estimation problem is only challenging when the cascade can spread for a large number of hops the presented experiments consider influence estimation from small seed sets under such settings even mc is not timeconsuming for the same reason the independent cascade model with uniform probability is a better model to study as it in general results in a much larger influence than the wc model for dense graphs the wc model tends to give low propagation probability notice that imm might be the stateoftheart method with approximation guarantees but efficient heuristics are possible as discussed in e it is wellknown that heuristic methods can provide solutions with qualities similar to the 1-1/e approximation therefore to demonstrate the efficiency of the proposed method it is better to also compare it with fast heuristics e debunking the myths of influence maximization an indepth benchmarking study the testing size appears to be too small to draw results that are statistically significant in addition there is no information about std to demonstrate the robustness the proposed learning model essentially uses a diffusion process of bounded steps to estimate the entire diffusion process since the layer number is 2 this means that only two steps of diffusion are considered under the wc model it is likely that the majority of the nodes are activated within two steps and therefore it is not very surprising that the estimate is accurate notice that the #p-hardness only holds for general diffusion steps and assuming that only two diffusion steps are considered many existing methods can be modified to be more efficient eg imm with twohop reverse sampling minor questions what does negative sample mean in sec 41 it is quite surprising that pun can produce highquality solutions to large graphs with a very small computation cost this essentially means that the #p-hardness could be overcome by a onestep influence estimation which looks too ambitious if this is the case sigmam should be a nice method for influence estimation which could be verified by experiments in a related issue for the running time of the proposed methods i am wondering if the time used for computing the representations is included for large graphs computing the representation involves multiplications of large matrices the proposed technique is not very novel and its practical utility needs better
justification docsepauthors propose several neural network modelbased approaches to influence maximization im first glie estimates influence using a gnn which can be plugged into optimization algorithms like celf second grim avoids the cost of having to estimate the influence of every candidate by approximating the marginal gain with a twolayer mlp finally pun avoids the cost of having to estimate the influence for every seed node by approximating the influence with features from gnn hidden states experiments demonstrate that the proposed method can generalize to graphs significantly different from training data also pun provides solution quality close to the strongest baseline with a fraction of compute time while the use of gnns for combinatorial optimization has become quite popular in recent years their application to influence maximization im is not straightforward as it is quite different from previous gnn applications the im problem has a strong structure submodularity and strong nonlearning algorithms are already established for the problem therefore it is quite surprising and encouraging to see that gnns can actually be quite competitive on this problem another strength of the paper is the thoroughness of the approach the paper proposes three levels of gradual transition from the traditional approach celf to the completely estimationbased approach pun compared to simply providing a single approach that works this gradual approach allows readers to better understand why a simpler approach does not suffice and more techniques need to be introduced one notable weakness of the paper is the lack of reference to alieva et al httpsopenreviewnetforumidac288vng7u since this paper also proposes a learningbased approach for submodular optimization this approach should have been discussed and compared against as a baseline also in 7 os s and 10 the authors heuristically construct features from frozen glie models the authors provide some intuition about them saying the hidden state hti can be considered as a label for each node whose sign indicates if it is predicted to be influenced however this seems like an overstatement the htis are intermediate hidden states from the glie model and there does not seem to be any mechanism like a loss function which encourages these intermediate hidden states to be the influence label many design choices like why 7 sums over t and normalizes by dt are unclear 10 is even more confusing because it only uses the first layer of the gnn does it mean that additional gnn layers are not needed actually it doesnt seem like the authors share an analysis of the impact of gnn layers lastly i wasnt sure sigmams can be interpreted as a submodular function in a standard sense for a submodular function we should be able to evaluate it for any set in the domain of the function however definition 12 assumes the given input s can be associated with the sequence of seed sets s1 s2 … ss in other words the function is only welldefined for the sequence of sets s1 s2 … observed during the execution of the greedy algorithm therefore i am not sure the 1-1/e approximation guarantee of greedy monotone submodular maximization applies here influence maximization is a quite surprising application of gnns the authors employ a systematic approach to build multiple methods with increasing departure from the traditional approach and increasing levels of computational advantage some of the modeling decisions look heuristic however and at least require a better explanation docsepthe paper proposes a neural network approach glie for
estimating the influence of a given seed in a given graph more importantly the authors propose three different methods to use the proposed influence estimation method for influence maximization the authors show the superior performance of their proposed method in comparison with baselines and the paper studies an important data mining problem ie influence maximization the proposed network for approximating the influence is interesting on its own the paper is overall wellwritten and easy to follow however the paper can benefit from polishing for instance the braces in eq 5 are a bit confusing also the experimental section is rushed for example in sec 41 it is not clear to me how the graph neural network is trained is it trained only on ba graphs if so how did the authors use that network for other graphs when the network weights are clearly dependent on the graph node size also it seems that the main novelty of the paper is in proposing to estimate the influence via graph neural networks and the proposed influence maximization methods use already existing techniques in combination with this influence estimation method in that sense the theoretical contributions of the paper are marginal however combining these techniques is still
nontrivial overall the paper proposes novel methods for influence estimation and influence maximization in my opinion the main weakness of the paper is the missing details in the experimental section ### Summary: this paper revisits the problem of influence maximization and suggests using graph neural networks to estimate an upper bound on the influence which can then be used to find good seed sets the paper gives a variety of experimental evidence that the methods improve on various algorithms in the literature there was a wide variation in opinions some reviewers felt that the overall idea was not particularly novel as methods that combine graph embeddings and reinforcement learning to solve influence maximization have already been proposed in the literature additionally some reviewers felt that the experiments were missing important comparisons particularly to learningbased methods without which it is difficult to argue that these methods really do advance the state of the art
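The reviews above repeatedly contrast the learned estimator with plain monte carlo influence estimation under the weighted cascade (wc) model, arguing that shallow cascades make monte carlo cheap for small seed sets. The sketch below illustrates that baseline; it is an editorial illustration with assumed data structures and function names, not code from the reviewed paper.

```python
# minimal sketch (not from the reviewed paper): monte carlo influence
# estimation under the independent cascade model with weighted cascade
# probabilities p(u, v) = 1 / indegree(v); graph format and names are
# illustrative assumptions
import random
from collections import defaultdict

def build_wc_graph(edges):
    """edges: iterable of (u, v) pairs; returns adjacency with wc probabilities."""
    indeg = defaultdict(int)
    for _, v in edges:
        indeg[v] += 1
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append((v, 1.0 / indeg[v]))  # weighted cascade probability
    return adj

def simulate_ic(adj, seeds, rng):
    """one independent cascade simulation; returns the number of activated nodes."""
    active = set(seeds)
    frontier = list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v, p in adj.get(u, []):
                if v not in active and rng.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt  # only newly activated nodes try their out-edges next round
    return len(active)

def mc_influence(adj, seeds, n_sims=1000, seed=0):
    """monte carlo estimate of the expected spread sigma(seeds)."""
    rng = random.Random(seed)
    return sum(simulate_ic(adj, seeds, rng) for _ in range(n_sims)) / n_sims

# usage: with low wc probabilities cascades stay shallow, so even plain
# monte carlo is cheap for small seed sets, which is the reviewers' point
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 0)]
adj = build_wc_graph(edges)
print(mc_influence(adj, {0}, n_sims=2000))
```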
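As a companion to the monte carlo sketch above, here is a minimal celf style lazy greedy selector that works with any influence estimator (monte carlo or a learned gnn surrogate). Its correctness relative to plain greedy relies on the estimator being monotone and submodular, which is exactly the property the reviewers question for the learned upper bound. Again an illustrative sketch with hypothetical names, not the authors' implementation.

```python
# minimal sketch (not from the reviewed paper): celf style lazy greedy seed
# selection on top of an arbitrary influence estimator
import heapq

def celf_select(nodes, influence_fn, k):
    """greedy seed selection with lazy (celf) marginal gain re-evaluation.

    influence_fn(seed_set) -> float, assumed monotone submodular.
    node ids are assumed hashable and comparable (eg ints).
    returns (seeds, estimated_spread)."""
    # initial marginal gains from the empty set; heapq is a min-heap, so store
    # negative gains; the last field records the round the gain was computed in
    heap = [(-influence_fn({v}), v, 0) for v in nodes]
    heapq.heapify(heap)
    seeds, spread = set(), 0.0
    for it in range(1, k + 1):
        while True:
            neg_gain, v, last = heapq.heappop(heap)
            if last == it:            # gain is fresh for the current seed set: take it
                seeds.add(v)
                spread += -neg_gain
                break
            # stale entry: recompute the marginal gain w.r.t. the current seeds
            gain = influence_fn(seeds | {v}) - spread
            heapq.heappush(heap, (-gain, v, it))
    return seeds, spread

# usage with the monte carlo estimator sketched above (or any gnn surrogate):
# seeds, est = celf_select(list(adj), lambda s: mc_influence(adj, s), k=5)
```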
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper introduces vlmbench a new benchmark for visionbased robot manipulation compared to the prior counterparts the tasks in this benchmark are compositional in terms of action objects and their attributes the benchmark also allows two types of goal definition goal pose constraint and path pose constraint to generate demonstrations the authors propose amsolver but do not provide many details about it finally some experiments on this benchmark are conducted with a model modified from cliport i believe the problem studied here is relevant and important visionbased agents executing tasks specified by natural language is of interest to the neurips community and could possibly help with other research including multitask learning multimodal learning etc the paper is overall illustrative some writeups could have been clearer though extending cliport to 6 dof robotic control is interesting however the proposed approach does not seem to be working at least on this benchmark having said the above i still have some major concerns about the methodology and experiment of this manuscript i hope the authors could help clarify them in a rebuttal as stated at the end of the introduction amsolver is one of the major contributions made by this paper however i cannot find enough details about it thereby making it hard to justify the novelty sec 3 seems to be talking about the grammatical structure of their tasks which i suppose should be part of sec 4 instead it is not clear how these task definitions connect to a demonstration generator or a robotic trajectory solver if my understanding is correct the authors seem to apply an offtheshelf solver provided by rlbench to their tasks if this is the case i doubt if this can be viewed as a contribution the authors are encouraged to elaborate more on the comparison between their proposed amsolver and other prior arts eg the solver in rlbench behaviour metaworld etc in table 1 the authors claim that their benchmark is the only one that includes automatic 6 dof grasping specifically they pregenerate some plausible twofingered grasps on the objects in their benchmark using gpd however it is very unclear how models could benefit from these grasping samples when solving their tasks the authors should at least demonstrate a prototype system that could leverage this data to justify the rationale for making it part of their benchmark vlmbench is claimed to be a compositional benchmark i may be biased but indeed i found it necessary to also include some compositional or ood generalization tests to further verify the compositional structure of their tasks this has been a common practice for other compositional benchmarks including clevr alfred etc please kindly remind me if such tests are already there and make it clearer in the paper the ablations in the experiments indicate that providing ground truth position and orientation could significantly lift the overall performance however more details on these model variants should be provided i can see a total of 6 outputs of the proposed 6dofcliport does pos mean xyz are already given while the other three remain to be predicted and does ori suggest the contrary it could be hard to parse the results without them the existing results also could have been more informative other than some numbers eg why is ground truth orientation less useful than position in most tasks even in 6 dof tasks such as door why does the model still fail even with this
privileged information some failure mode analysis seems to be necessary to help understand the challenge imposed by this benchmark docsepthis paper presents a framework for vision and language manipulation aimed at being a benchmark vlmbench is a benchmark for vision and language manipulation which contains a manipulator and a tabletop environment vlmbench provides access to several sensors such as rgb depth segmentation as well as other agentspecific observations such as end effector pose the agent can be commanded to perform a variety of tasks such as pick and place stack drop etc each task with a potential for varying several parameters such as object type qualifying the task using constraints such as direction size extent etc given this information vlmbench solves the task using a rule based task solver called amsolver amsolver can compute solutions that can adhere to goalpath constraints finally the paper also describes a manipulation agent called 6dcliport which takes inspiration from cliport and given rgbd observations and a language instruction set for a task outputs relevant goal positions for the end effector the authors apply 6dcliport to environments in vlmbench and show that the performance of 6dcliport surpasses that of simple unimodal baselines the paper presents a benchmark for a domain that is of significant interest to the community the simulatordataset allows for specifying constraints of several kinds such as positionorientation constraints constraints related to the trajectory executed by the agent similarly the tasks have a good amount of variations possible in sizes shapes directions of motion etc linking such high fidelity task specification with language has the potential to enable more complex architectures and solution methodologies for vision and language manipulation the paper identifies the compositional nature of task specification as a key highlighting feature of the benchmark but it is unclear how this is betterdifferent than the composition seen in existing benchmarksdatasets like alfred 1 or calvin 2 the purpose of the discussion regarding the 6dcliport agent is unclear if the intent was to showcase how the environmentstasks from vlmbench could be solved by existing algorithms or pose a challenge to such techniques it would have been useful to implement a set of representative baseline algorithms for visionlanguage manipulation and show the performance on vlmbench if the intent was to demonstrate a new algorithm for vision language manipulation in general it would have been useful to compare with other baselines fundamentally the unimodal baseline seems to always be at a disadvantage as it is intuitive that an rgbdlanguage algorithm would perform better rgb depth language is a multimodal combination that is not very uncommon in the manipulation field so several other datasets can also provide this data and hence it is not really necessary to prove that multimodality is useful and important it is also not clear how 6dcliport performs relative to other best performing vlm techniques in the literature i think this section needs reframing to really understand the purpose of 6dcliport it is also not unclear why the performance of the unimodal baseline and to some extent 6d cliport without gt is so extremely low on the benchmark tasks also considering one of the strengths of vlmbench is its ability to generate variations and constraints in the task space the fact that 6d cliport fails so badly at tasks that have high variations see section 63 raises questions 
about the relevance of this technique 1 mohit shridhar jesse thomason daniel gordon yonatan bisk winson han roozbeh mottaghi luke zettlemoyer dieter fox alfred a benchmark for interpreting grounded instructions for everyday tasks 2 oier mees lukas hermann erick rosetebeas wolfram burgard calvin a benchmark for languageconditioned policy learning for longhorizon robot manipulation tasks docsepthe presented paper proposes a robotics manipulation benchmark containing language instructions and corresponding manipulation tasks eg pick and place wipe a table open a drawer etc the main contributions of the paper are 1 automatic generator of individual demonstrations with corresponding linguistic instructions the demonstrations might be composed of simple tasks there are 8 basic task types pick place objects stack objects drop pencil put into shape sorter pour water wipe table open drawer use door that can be varied by color shapes size relative positions etc 2 vlmbench benchmark including a set of manipulation tasks that compose of visual observations accompanied by linguistic instructions the manipulation tasks have several variations and can be automatically evaluated 3 baseline algorithm evaluated on vlm tasks i fully agree with the authors that such benchmark is filling an important blank spot to measure quality of embodied agents who should follow language instructions to perform object manipulation tasks while there are several benchmarks and datasets for language descriptions to images there are not that many for the embodied domain especially not those that would enable measuring the quality in an organized structured way the paper is well organised and describes clearly the presented benchmark the benchmark itself is from my perspective very well designed it enables evaluation of the quality of individual subtasks and more complex tasks can be created by joining the individual subtasks i appreciate that the evaluation is enabled by the design of the dataset all the tasks are specified by the change in position and orientation and therefore the ground truth waypoints are known the evaluation is done by checking if individual waypoints have been reached the installation guide datasheet file as well as appendix information are very helpful and detailed it is claimed in the paper that more complex tasks can be built as a composition of the simple tasks however from the text itself it is not clear how they are built and evaluated the actions are restricted to the manipulation with an object start of the action is determined by grasping an object and the end by releasing therfore actions such as pushkickpullhammeretc are not included it is also not clear how these could be included is there a way how to add a new task and evaluated i would appreciate if the authors could discuss on this i am also not sure how the difficulty of the tasks might be compared or evaluated do the authors consider that all the tasks are on the same level is there any way to evaluate the difficulty of the individual tasks eg number of the waypoints needed to pass the dataset itself is quite big to download aprox 10 gb it might be good to include also smaller version for users who want to just test it and see samples from the dataset i was able to run the pretrained models generate new data and test configurations as described in the readme however it is not clear how to turn on the simulator is there any argument to do so the ideaprocess is described well in the paper but it is hard to navigate in the code as there is 
not any detailed documentation i would appreciate more detailed documentation for the code itself docsepthis paper considers the problem of 6 degrees of freedom dof robot manipulation following language instructions which is an interesting and promising direction and there are several works emerged recently in particular the authors proposed a simulator amsolver based on the robot learning framework rlbench the authors added the language instruction part in rlbench and proposed modular rulebased task templates to classify and construct tasks including control constraints moving the target object with goal pose constraints m1 and moving the target object along a trajectory while satisfying the motion constraints m2 an objectcentric representation is used to add variations when generating data including class color size and geometry shape based on amsolver the authors constructed the dataset vlmbench for 6dof languageguided robot manipulation research based on cliport the authors further proposed a 6dof keypointbased model as a baseline for vlmbench in summary this paper considers an important problem languageguided robot manipulation and the contributions consist of a simulator amsolver a dataset vlmbench and a baseline model 6dcliport 1 i think the dataset is important for the language and roboticsvision research community and makes a step towards languageguided 6dof robot manipulation existing 6dof datasets such as manipulathorhttpsgithubcomallenaimanipulathor calvinhttpsgithubcommeescalvin etc usually focus on long horizontal task planning while ravenshttpsgithubcomgoogleresearchravens is suitable for languageguided manipulation it only operates on a 2d plane therefore i think vlmbech fills the gap of languageguided 6dof robot manipulation well 2 the authors propose and build a complete pipeline for languageguided 6dof robot manipulation research including a simulator amsolver a dataset vlmbench and a baseline model 6dcliport which can help novices get to know this field quickly and researchers develop and compare algorithms conveniently 1 following the github repository of this paper it is hard to reproduce the results table 3 and table 6 in the appendix for instance i used the pretrained model for the stack task and always got a 0 success rate 2 the process of generating the dataset is not described very clearly how many episodes for each task in training what is the difference between seen and unseen how is the training and validation split following the github repository of this paper i think that running python toolsdatasetgeneratornlppy only gives the validation dataset 3 some parts of the paper are not written very well for the amsolver how do the three modular rulebased constraints correspond to the three general household task categories some literature on languageguided robot manipulation such as 1 is missing the conclusion is missing 1 mees et al what matters in language conditioned imitation learning iros 2022 docsepthe authors of vlmbench propose a new benchmark for robotic manipulation based on visual input and a language query a benchmark assesses the completion of the task based on taskrelated final state constraints additionally the work proposes a new dataset associated with the benchmark the authors propose 8 various manipulation tasks for each task a set of randomised templatebased scenarios is defined additionally the paper introduces amsolver which is an automatic tool for demonstration generation i believe the authors did a good job in formalising various 
manipulation tasks the work proposes a systematic way to describe new tasks in robotic manipulation problems sec 31 i think the modular design suggested in the paper is a good idea objectcentric representations along with unit task builder allow for the scalability of data generation amsolver seems to provide a flexible way to generate demonstrations for new tasks after having defined proper goal constraints i believe providing automatic 6dof grasping is a big benefit of the tool the dataset provided with the benchmark offers a good variety of manipulation scenarios a baseline and ablation suggested by the authors are sufficient to emphasise the challenges of the proposed benchmark given the proposition of the dataset i would expect more details on the collected data to be provided some of the details are covered in the appendix however it would be good to discuss what set of objects is used in the tasks list of objects do they come from the existing dataset and how many different objects can be used per task eg how many can be opened some more details on how the initial scenes were randomised would be also useful how many objects were placed randomlywith constraints additionally what is the process of creating the instructions from templates is it similar to clevr 1 additionally for the dataset description what is the distribution of demonstrations over the tasks there are 5 different views generated in the tool and a description of what views are incorporated and why such views were chosen would be a nice addition also what is the exact set of observations provided in the system output for the benchmarking it would be good to discuss the performance with respect to fps that can be achieved parsing 5 images per 50ms may provide a noticeable delay sec 61 suggests that multistep tasks are provided with ground truth subgoals i think it would be worth showing the model dealing with multistep tasks and then maybe showing how compositioning of simple tasks affects the performance and probably provides a significant challenge to the proposed model i believe that the issues i mentioned are of a rather minor nature and are easily correctable 1 justin johnson li feifei bharath hariharan c lawrence zitnick laurens van der maaten and ross girshick clevr a diagnostic dataset for compositional language and elementary visual reasoning in ieee conference on computer vision and pattern recognition cvpr 2017 ### Summary:
The paper presents a benchmark for a domain that is of significant interest to the community, and a complete pipeline is proposed and demonstrated. The work proposes a systematic way to describe tasks in robotic manipulation problems, suitable for the dataset/benchmark presented. While reviewers were able to explore the tools presented, clarifications on how to run the tools provided and reproduce results should be clearly present in the documentation for the toolkit itself. The authors are encouraged to consider the strengths and weaknesses identified by reviewers for further clarifying and strengthening the manuscript.

Strengths:
- The problem studied here is relevant and important: vision-based agents executing tasks specified by natural language are of interest to the NeurIPS community.
- The authors demonstrate a complete pipeline for language-guided robot manipulation research, suitable for getting novel research tasks in this area up and running quickly.
- The paper is well organized and clearly describes the presented benchmark, especially with revisions.
- The presented system allows for specifying a variety of kinds of constraints, making it overall fairly widely applicable.
- Providing automatic 6-DoF grasping is a significant improvement of the tool over existing related systems.

Weaknesses:
- Stronger differentiation from existing datasets/benchmarks in the body of the paper would help clarify the contributions of the work.
- The proposed approach does not seem to be working well on cliport, which raises questions about the overall relevance of the technique that should be addressed.
- It is not clear from the paper itself how new tasks can be included and evaluated.
- Some newer literature on language-guided robot manipulation should be included.
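The summary above credits the work with a systematic way to describe manipulation tasks, and the reviews describe compositional, template-based task variation over actions, object classes, and attributes such as colour and size, with CLEVR-style instruction generation. The sketch below illustrates what such template-based instruction composition could look like; the vocabulary, templates, and function names are invented for illustration and are not the benchmark's actual code.

```python
from dataclasses import dataclass
from random import Random

# Hypothetical templates and vocabulary; the real benchmark's categories and wording may differ.
ACTION_TEMPLATES = {
    "pick": "pick up the {attr} {obj} and place it on {goal}",
    "stack": "stack the {attr} {obj} onto {goal}",
    "open": "open the {attr} {obj}",
}

@dataclass
class TaskSpec:
    action: str   # unit task type, e.g. "pick"
    obj: str      # object class, e.g. "cube"
    attr: str     # disambiguating attribute, e.g. "small red"
    goal: str     # goal description, e.g. "the green target zone"

def compose_instruction(spec: TaskSpec) -> str:
    """Fill the per-action template with object-centric attributes to get
    one language instruction for one generated episode."""
    return ACTION_TEMPLATES[spec.action].format(attr=spec.attr, obj=spec.obj, goal=spec.goal)

def sample_variation(rng: Random) -> TaskSpec:
    """Randomise the compositional factors (action, object, attribute, goal)."""
    return TaskSpec(
        action=rng.choice(list(ACTION_TEMPLATES)),
        obj=rng.choice(["cube", "drawer", "mug"]),
        attr=rng.choice(["small red", "large blue", "green"]),
        goal=rng.choice(["the green target zone", "the top shelf", "the plate"]),
    )

print(compose_instruction(sample_variation(Random(0))))
```

Pairing each sampled specification with a scene built from the same factors is what would make the benchmark compositional in the sense the first review describes: held-out combinations of action, object, and attribute could then be reserved for generalization tests.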
[input_ids, attention_mask, and labels columns omitted here: long integer token-id arrays corresponding to the tokenized form of the example text above.]
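One reviewer of the vlmbench example above notes that every task is specified by a change in position and orientation, so ground-truth waypoints are known and success is evaluated by checking whether each waypoint has been reached. Below is a minimal sketch of such a check; the pose representation, tolerance values, and function names are assumptions rather than the benchmark's actual implementation.

```python
import numpy as np

def pose_reached(achieved_pos, achieved_quat, target_pos, target_quat,
                 pos_tol=0.02, rot_tol_deg=10.0):
    """Check one waypoint: position within pos_tol metres and orientation
    within rot_tol_deg degrees of the target (tolerances are assumed)."""
    pos_ok = np.linalg.norm(np.asarray(achieved_pos) - np.asarray(target_pos)) <= pos_tol
    # Angle between unit quaternions: theta = 2 * arccos(|<q_a, q_t>|)
    dot = min(abs(float(np.dot(achieved_quat, target_quat))), 1.0)
    rot_ok = np.degrees(2.0 * np.arccos(dot)) <= rot_tol_deg
    return pos_ok and rot_ok

def episode_success(achieved_waypoints, ground_truth_waypoints):
    """Count an episode as successful only if every ground-truth waypoint,
    each a (position, quaternion) pair, is matched by the corresponding
    achieved pose."""
    return all(
        pose_reached(a_pos, a_quat, t_pos, t_quat)
        for (a_pos, a_quat), (t_pos, t_quat) in zip(achieved_waypoints, ground_truth_waypoints)
    )
```

A path-constrained task of the kind one review calls the M2 type would additionally check intermediate poses along the executed trajectory, not just the final goal pose.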
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: This paper proposes a rejection sampling algorithm for sampling from the GAN generator. The authors establish a very clear connection between the optimal GAN discriminator and the rejection sampling acceptance probability; they then explain very clearly that in practice the connection is not exact, and propose a practical algorithm. Experimental results suggest that the proposed algorithm helps increase the accuracy of the generator, measured in terms of Inception Score and Frechet Inception Distance. It would be interesting, though, to see if the proposed algorithm buys anything over a trivial rejection scheme, such as looking at the discriminator values and rejecting the samples if they fall below a certain threshold. This being said, I do understand that the proposed practical acceptance ratio in equation 8 is close to the theoretically justified acceptance ratio. Since in practice the learnt discriminator is not exactly the ideal discriminator D(x), I think it is super okay to add a constant and optimize it on a validation set; equation 7 is off anyway, since in practice the things (e.g. the discriminator) are not ideal. But again, I do think it would make the paper much stronger to compare equation 8 with some other heuristic-based rejection schemes.

docsep

This paper assumes that in a GAN the generator is not perfect and some information is left in the discriminator, so that it can be used to reject some of the fake examples produced by the generator. The introduction, problem statement, and justification for rejection sampling are excellent, with a level of clarity that makes it understandable by non-expert readers and a wittiness that makes the paper fun to read. I assume this work is novel; the reviewer is more an expert in rejection than in GANs and is aware how few publications rely on rejection. However, the authors fail to compare their algorithm to a much simpler rejection scheme, and a revised version should discuss this issue. Let's jump to equation 8: compared to a simple use of the discriminator for rejection, it adds the term under the log. The basic rejection rule would read "accept x if D(x) > gamma", and one would adjust the threshold gamma to obtain the desired operating point. I am wondering why no comparison is provided with basic rejection. Let me try to understand the Gaussian mixture experiment, as the description is ambiguous. GAN setting: 10k examples are generated and reported in figure 3. DRS setting: 10k examples are generated and submitted to the algorithm in figure 1; for each batch, a line search sets gamma so that 95% of the examples are accepted, thus only 9.5k are reported in figure 3. What about basic rejection using D(x) > gamma? How does it compare to DRS at the same 95% acceptance rate? If this is my understanding, then the comparison in figure 3 is unfair, as DRS is allowed to pick and choose; for completeness, basic rejection should also be added. Going back to eq. 8, one realizes that the difference between DRS rejection and basic rejection may be negligible: a first-order Taylor expansion of log(1 - x), which would apply to the case where the rejection probability is small, yields F(x) ≈ D(x) - D_M + exp(D(x) - D_M); since x + exp(x) is monotonous, thresholding over it is the same as thresholding over x. Back to basic rejection.

docsep

This paper proposed a post-processing rejection sampling scheme for GANs, named Discriminator Rejection Sampling (DRS), to help filter good samples from the GAN's generator. More specifically, after training, the GAN's generator and discriminator are fixed, and the discriminator is further exploited to design a rejection sampler, which is used to reject the bad samples generated from the fixed generator. Accordingly, the accepted generated samples have good quality (better IS and FID results). Experiments of the SAGAN model on GMM toys and the ImageNet dataset show that DRS helps further increase the IS and reduce the FID. The paper is easy to follow and the experimental results are convincing; however, I am curious about the following questions:
1. Besides helping generate better samples, could you list several other applications where the proposed technique is useful?
2. In the last paragraph of page 4, I don't think the presented discriminator rejection sampling addresses the issues in sec 3.2, especially the first paragraph of page 5.
3. The hyperparameter gamma in eq. 8 is of vital importance for the proposed DRS; actually, it is believed to be the key to determining whether DRS works or not. Detailed analysis/experiments about the hyperparameter gamma are considered missing.
### Summary:
The paper proposes a discriminator-dependent rejection sampling scheme for improving the quality of samples from a trained GAN. The paper is clearly written and presents an interesting idea, and the authors extended and improved the experimental analyses as suggested by the reviewers.
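The reviews above contrast the paper's discriminator rejection sampling (DRS) with a plain threshold on the discriminator output. The sketch below shows the two schemes side by side. It is a simplified illustration built only on the standard identity that, for an optimal discriminator, D(x) / (1 - D(x)) equals the density ratio p_data(x) / p_g(x); it is not the paper's exact acceptance formula from equation 8, and using the batch maximum as the normaliser is a further simplification.

```python
import numpy as np

rng = np.random.default_rng(0)

def drs_accept(d_probs, eps=1e-8):
    """Idealised rejection sampling from discriminator probabilities: accept x
    with probability [D(x) / (1 - D(x))] / M, where M is the largest ratio
    observed in the batch (a stand-in for the true maximum)."""
    ratios = d_probs / (1.0 - d_probs + eps)
    p_accept = ratios / ratios.max()
    return rng.random(len(d_probs)) < p_accept

def threshold_accept(d_probs, target_rate=0.95):
    """The 'basic rejection' baseline the second review asks for: keep samples
    whose discriminator score exceeds a threshold gamma chosen to hit a target
    acceptance rate (a percentile, mimicking the line search on gamma)."""
    gamma = np.quantile(d_probs, 1.0 - target_rate)
    return d_probs > gamma

# Toy usage: fake discriminator outputs for 10k generated samples.
d_probs = rng.beta(2.0, 2.0, size=10_000)
print(drs_accept(d_probs).mean(), threshold_accept(d_probs).mean())
```

Comparing the two accepted sets at a matched acceptance rate is essentially the experiment the second reviewer requests, and the monotonicity argument in that review explains why the two schemes can end up selecting very similar samples.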
[input_ids, attention_mask, and labels columns omitted here: token-id arrays corresponding to the tokenized form of the example text above.]
7152, 33032, 2520, 2929, 4081, 247, 1501, 21678, 18235, 10491, 6974, 323, 305, 507, 4907, 7134, 12915, 18235, 10491, 1837, 84, 281, 1361, 5806, 1175, 3530, 432, 305, 507, 14156, 625, 5742, 846, 3733, 305, 507, 14156, 285, 7134, 12915, 403, 4229, 305, 507, 7134, 12915, 310, 2007, 28734, 281, 2216, 247, 18235, 1775, 17407, 534, 310, 908, 281, 12009, 253, 3076, 3530, 4561, 432, 253, 4229, 14156, 15672, 253, 7607, 4561, 3530, 452, 1175, 3290, 1805, 310, 285, 269, 301, 1543, 4679, 273, 256, 12043, 1566, 327, 305, 2188, 23908, 285, 4440, 257, 292, 10895, 921, 326, 1837, 84, 7729, 2007, 5459, 253, 310, 285, 11355, 253, 269, 301, 50276, 783, 2929, 310, 3477, 281, 956, 285, 253, 5661, 1543, 403, 21414, 2299, 891, 717, 14338, 670, 253, 956, 3533, 50276, 18, 186, 67, 11587, 9073, 6635, 1805, 3530, 812, 368, 1618, 2067, 643, 4893, 835, 253, 4081, 5853, 310, 4217, 50275, 19, 186, 249, 253, 1390, 12494, 273, 3239, 577, 891, 13414, 1158, 253, 3559, 7134, 12915, 18235, 10491, 12453, 253, 3374, 275, 4706, 4567, 3340, 253, 806, 12494, 273, 3239, 608, 50276, 20, 186, 783, 4373, 19484, 17356, 275, 16186, 854, 310, 273, 12232, 6349, 323, 253, 4081, 1837, 84, 2686, 352, 310, 6566, 253, 2234, 281, 8925, 1880, 1837, 84, 2987, 390, 417, 7000, 5127, 39549, 468, 3825, 670, 4373, 19484, 17356, 403, 2783, 5816, 50276, 187, 187, 4118, 18435, 27, 783, 2929, 29328, 247, 7134, 12915, 7976, 18235, 10491, 6974, 323, 11138, 253, 3290, 273, 3530, 432, 247, 10166, 36827, 253, 2929, 310, 4518, 3542, 10262, 271, 4722, 2934, 285, 253, 4477, 6508, 285, 5520, 253, 5661, 6260, 347, 5125, 407, 253, 30628 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper presents an dfabased approach to constrain certain behavior of rl agents where behavior is defined by a sequence of actions this approach assumes that the developer has knowledge of what are goodbad behavior for a specific task and that the behavior can be checked by handcoded dfas or pdas during training whenever such behavior is detected the agent is given a negative reward and the rl state is augmented with the dfa state the authors experimented with different state augmentation methods eg onehot encoding learned embedding on 3 atari tasks the paper is clearly written i also like the general direction of biasing the agents exploration away from undesirable regions or conversely towards desired regions with prior knowledge however i find the results hard to read 1 goal the goal of this work is unclear is it to avoid disastrous states during exploration training or to inject prior knowledge into the agent to speed up learning or to balance tradeoffs between constraint violation and reward optimization it seems the authors are trying to do a bit of everything but then the evaluation is insufficient for example when there are tradeoffs between violation and rewards we expect to see tradeoff curves instead of single points for comparison without the tradeoff i suppose adding the constraint should speed up learning in which case learning curves should be shown 2 interpreting the results 1 what is the reward function used i suppose the penalty should have a large effect on the results which can be tuned to generate a tradeoff curve 2 why not try to add the enforcer during training a slightly more complex baseline would be to enforce with probability 1epsilon to control the tradeoff 3 except for fig 3 right and fig 4 left the constraints doesnt seem to affect the results much judging from the results of vanilla dqn and dqnenforcer are these the best settings to test the approach overall an interesting and novel idea but results are a bit lackingdocsepthis paper presents an approach for biasing an agent to avoid particular action sequences these action sequence constraints are defined with a deterministic finite state automaton dfa the agent is given an additional shaping reward that penalizes it for violating these constraints to make this an easier learning problem for the agent its state is augmented with additional information either an action history the state of the dfa or an embedding of the dfa state the authors show that these approaches do reduce these action constraint violations over not doing anything about them its unclear to me what the use case is for constraints solely on the action space of the agent and why it would be useful to treat them this way the authors motivate and demonstrate these constraints on 3 atari games but it is clear that the constraints they come up with negatively affect performance on most of the games so they are not improving performance or safety of the agent are there useful constraints that only need to view the sequence of actions of the agent and not any of the state if there are such constraints why not simply restrict the agent to only take the valid actions what is the benefit of only biasing it to avoid violating those constraints with a shaping reward this restriction was applied during testing but not during training in all but the first task no 1d dithering in breakout none of the proposed approaches were able to 
completely eliminate constraint violations why was this if these are really constraints on the action sequence isnt this showing that the algorithm does not work for the problem you are trying to solve the shaping reward used for the four atari games is -1000 in most work on dqn in atari the game rewards are clipped to be between -1 and 1 to improve stability of the learning algorithm were the atari rewards clipped or unclipped in this case did having the shaping reward be of such large magnitude have any adverse effects on learning performance adding a shaping reward for some desired behavior of an agent is straightforward the more novel part of this work is in augmenting the state of the agent with the state of a dfa that is tracking the action sequence for constraint violations three approaches are compared and it does appear that dfa onehot is better than the other approaches or no augmentation pros augmenting agent state with the state of a dfa tracking action sequence constraints is novel and useful for this problem cons unclear if constraints on action sequences alone are useful no clear benefit of addressing this problem through shaping rewards no comparison to simply training with only nonviolating action sequences algorithm still results in action constraint violations in 5 of 6 tasks docsepthis work aims to use formal languages to add a reward shaping signal in the form of a penalty on the system when constraints are violated there is also an interesting notion of using an embedding based on the action history to aid the agent in avoiding violations however i do not believe this paper did a good enough job in situating this work in the context of prior work in particular camacho 2017 there is a significant related work section that does an ok job of describing many other works but to my knowledge camacho 2017 is the most similar to this one minus the embedding yet is not mentioned here it is difficult to find all related work of course so i would encourage revision with a detailed description of the novelty of this work in comparison with that one i would also encourage a more thoughtful examination of the theoretical ramifications of the reward shaping signal with respect to the optimal policy as camacho 2017 do and as is modeled in the ng 1999 paper as of this revision however im not sure i would recommend it for publication additionally i suggest that the authors describe the reward shaping mechanism a bit more formally it was unclear whether it fits into ngs potential function methodology at first pass comments it would be nice to explain to the reader in intuitive terms what no 1d dithering means near this text i understand that later on this is explained but for clarity it would be good to have a short explanation at the first mention of this term as well it would be good to clarify in figure 1 what lr2 is since in the main text near the figure it is just lr2 and this is only explained several pages ahead an interesting connection that might be made is to ng et als reward shaping mechanism if the shaping function is based on a statedependent potential then the optimal policy under the new mdp is still optimal for the old mdp it would be interesting to see how well this holds under this schema in fact this seems like analysis that several other works have done for a very similar problem see below i have concerns about the novelty of this method it seems rather similar to camacho alberto oscar chen scott sanner and sheila a mcilraith decisionmaking with nonmarkovian
rewards from ltl to automatabased reward shaping in proceedings of the multidisciplinary conference on reinforcement learning and decision making rldm pp 279-283 2017 camacho alberto oscar chen scott sanner and sheila a mcilraith nonmarkovian rewards expressed in ltl guiding search via reward shaping in proceedings of the tenth international symposium on combinatorial search socs pp 159-160 2017 however that work proposes a similar framework in a much more formal way in fact in that work a dfa is also used as a reward shaping signal from what i can tell for the same purpose through a similar mechanism it is possible however that i missed something which contrasts the two works another work that can be referenced is de giacomo giuseppe luca iocchi marco favorito and fabio patrizi reinforcement learning for ltlf/ldlf goals arxiv preprint arxiv 1807.06333 2018 i think it is particularly important to situate this work within the context of those others general the structure of the paper was a bit all over the place crucial details were spread throughout and it took me a couple of passes to put things together for example it wasnt quite clear what the reward shaping mechanism was until i saw the -1000 and then had to go back to figure out that basically -1000 is added to the reward if the constraint is violated i would suggest putting relevant details all in one place for example our reward shaping function was fx = -1000 if constraint violation 0 otherwise ### Summary:
the paper studies the problem of reinforcement learning under certain constraints on action sequences the reviewers raised important concerns regarding 1 the general motivation 2 the particular formulation of constraints in terms of action sequences and 3 the relevance and significance of experimental results the authors did not submit a rebuttal given the concerns raised by the reviewers i encourage the authors to improve the paper to possibly resubmit to another venue
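to make the constraint scheme discussed in the reviews above concrete, the following is a minimal sketch of an environment wrapper that tracks a hand-coded dfa over the agent's action sequence, augments the observation with the dfa state, and adds a large negative shaping reward when the dfa reaches a violation state. the gym-style env interface, the example dfa (forbidding a left-right-left dithering pattern), and the -1000 penalty value are illustrative assumptions for this sketch, not the reviewed paper's actual implementation.

```python
# Minimal sketch under the assumptions stated above; not the reviewed paper's code.
VIOLATION = "bad"

# Example DFA forbidding the action pattern LEFT, RIGHT, LEFT (1-d dithering).
DFA_TRANSITIONS = {
    ("start", "LEFT"): "saw_L",
    ("saw_L", "RIGHT"): "saw_LR",
    ("saw_LR", "LEFT"): VIOLATION,
}

def dfa_step(state, action):
    # Any (state, action) pair not listed above resets the DFA to its start state.
    return DFA_TRANSITIONS.get((state, action), "start")

class ConstraintShapingWrapper:
    """Augments observations with the DFA state and penalizes constraint violations."""

    def __init__(self, env, penalty=-1000.0):
        self.env = env              # assumed gym-style interface: reset() / step(action)
        self.penalty = penalty
        self.dfa_state = "start"

    def reset(self):
        self.dfa_state = "start"
        obs = self.env.reset()
        return (obs, self.dfa_state)            # RL state augmented with the DFA state

    def step(self, action):
        obs, reward, done, info = self.env.step(action)
        self.dfa_state = dfa_step(self.dfa_state, action)
        if self.dfa_state == VIOLATION:
            reward += self.penalty              # shaping penalty on violation
            self.dfa_state = "start"
        return (obs, self.dfa_state), reward, done, info
```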
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 10262, 271, 20926, 357, 833, 2746, 281, 37709, 2176, 3879, 273, 391, 77, 6083, 835, 3879, 310, 2931, 407, 247, 3425, 273, 5231, 436, 2746, 19584, 326, 253, 13722, 556, 3640, 273, 752, 403, 1175, 14367, 3879, 323, 247, 2173, 4836, 285, 326, 253, 3879, 476, 320, 10141, 407, 1133, 38059, 20926, 284, 390, 31385, 284, 1309, 3733, 10793, 824, 3879, 310, 5189, 253, 5570, 310, 1677, 247, 4016, 10921, 285, 253, 391, 77, 1375, 310, 31612, 342, 253, 277, 6855, 1375, 253, 4477, 3368, 264, 342, 1027, 1375, 42072, 3082, 24088, 581, 12022, 9706, 6311, 21496, 327, 495, 387, 1792, 8892, 50276, 783, 2929, 310, 4518, 3542, 891, 671, 751, 253, 2087, 3884, 273, 1794, 2355, 253, 6083, 17947, 1977, 432, 26016, 4811, 390, 5636, 600, 4404, 6799, 4811, 342, 2720, 3640, 2299, 891, 1089, 253, 1543, 1892, 281, 1239, 50276, 18, 4736, 253, 4736, 273, 436, 789, 310, 12744, 310, 352, 281, 3693, 37359, 3054, 1309, 17947, 50276, 31158, 390, 281, 14888, 2720, 3640, 715, 253, 5570, 281, 3885, 598, 4715, 390, 281, 6654, 5454, 14273, 875, 7658, 8411, 285, 10921, 13757, 352, 3133, 253, 4477, 403, 2820, 281, 513, 247, 2372, 273, 3253, 533, 840, 253, 7103, 310, 12497, 323, 1650, 672, 627, 403, 5454, 14273, 875, 8411, 285, 23267, 359, 1902, 281, 923, 5454, 2727, 9191, 3185, 273, 2014, 2792, 323, 5301, 1293, 253, 5454, 2727, 891, 9428, 6240, 253, 7658, 943, 3885, 598, 4715, 275, 534, 1083, 4715, 9191, 943, 320, 2011, 50276, 19, 29375, 253, 1543, 337, 752, 310, 253, 10921, 1159, 908, 891, 9428, 253, 12339, 943, 452, 247, 1781, 1055, 327, 253, 1543, 534, 476, 320, 24251, 281, 6635, 247, 5454, 2727, 6970, 374, 2139, 417, 1611, 281, 823, 253, 546, 1542, 1209, 1309, 3733, 247, 5777, 625, 2570, 8245, 651, 320, 281, 7767, 342, 5912, 337, 4259, 281, 1453, 253, 5454, 2727, 495, 3707, 323, 3036, 495, 987, 285, 3036, 577, 1669, 253, 10806, 36908, 1646, 281, 2818, 253, 1543, 1199, 32721, 432, 253, 1543, 273, 26724, 277, 47051, 285, 277, 82, 26159, 1542, 1209, 50276, 609, 841, 253, 1682, 7533, 281, 1071, 253, 2746, 50276, 1189, 455, 271, 4722, 285, 4460, 2934, 533, 1543, 403, 247, 2372, 14999, 7152, 33032, 2520, 2929, 10262, 271, 2746, 323, 1794, 2355, 271, 5570, 281, 3693, 1798, 2250, 6430, 841, 2250, 3425, 10806, 403, 2931, 342, 247, 30027, 6486, 1375, 3772, 13078, 277, 6855, 253, 5570, 310, 1677, 271, 3081, 29209, 10921, 326, 29697, 4219, 352, 323, 26554, 841, 10806, 281, 1056, 436, 271, 6927, 4715, 1895, 323, 253, 5570, 697, 1375, 310, 31612, 342, 3081, 1491, 2057, 271, 2250, 2892, 253, 1375, 273, 253, 277, 6855, 390, 271, 21496, 273, 253, 277, 6855, 1375, 253, 4477, 921, 326, 841, 7274, 513, 4796, 841, 2250, 7658, 15927, 689, 417, 2509, 2712, 670, 731, 50276, 953, 12744, 281, 479, 752, 253, 897, 1083, 310, 323, 10806, 12718, 327, 253, 2250, 2317, 273, 253, 5570, 285, 2139, 352, 651, 320, 4217, 281, 1555, 731, 436, 1039, 253, 4477, 41509, 285, 7568, 841, 10806, 327, 495, 387, 1792, 3958, 533, 352, 310, 2590, 326, 253, 10806, 597, 1705, 598, 342, 18123, 2818, 3045, 327, 954, 273, 253, 3958, 594, 597, 403, 417, 11138, 3045, 390, 5252, 273, 253, 5570, 403, 627, 4217, 10806, 326, 760, 878, 281, 1859, 253, 3425, 273, 5231, 273, 253, 5570, 285, 417, 667, 273, 253, 1375, 50276, 338, 627, 403, 824, 10806, 2139, 417, 3365, 4656, 253, 5570, 281, 760, 1379, 253, 3588, 5231, 752, 310, 253, 5649, 273, 760, 1794, 2355, 352, 281, 3693, 26554, 1110, 10806, 342, 247, 29209, 10921, 436, 12400, 369, 3732, 
1309, 5175, 533, 417, 1309, 3733, 50275, 249, 512, 533, 253, 806, 4836, 642, 337, 69, 277, 1622, 272, 275, 2740, 483, 5293, 273, 253, 4081, 7274, 497, 2104, 281, 4336, 13469, 7658, 15927, 2139, 369, 436, 604, 841, 403, 1663, 10806, 327, 253, 2250, 3425, 310, 2649, 436, 4645, 326, 253, 5933, 1057, 417, 789, 323, 253, 1895, 368, 403, 2820, 281, 8415, 50275, 783, 29209, 10921, 908, 323, 253, 1740, 387, 1792, 3958, 310, 9098, 275, 954, 789, 327, 277, 47051, 275, 387, 1792, 253, 2165, 23267, 403, 502, 6390, 281, 320, 875, 337, 285, 337, 281, 3157, 7882, 273, 253, 4715, 5933, 497, 253, 387, 1792, 23267, 502, 6390, 390, 440, 498, 6390, 275, 436, 1083, 858, 1907, 253, 29209, 10921, 320, 824, 1781, 9777, 452, 667, 10021, 2538, 327, 4715, 3045, 50276, 8052, 247, 29209, 10921, 323, 690, 6799, 3879, 273, 271, 5570, 310, 15246, 253, 625, 4460, 629, 273, 436, 789, 310, 275, 35919, 272, 253, 1375, 273, 253, 5570, 342, 253, 1375, 273, 247, 277, 6855, 326, 310, 12544, 253, 2250, 3425, 323, 7658, 15927, 1264, 7274, 403, 2429, 285, 352, 1057, 3176, 326, 277, 6855, 581, 12022, 310, 1805, 685, 253, 643, 7274, 390, 642, 42072, 50276, 856, 84, 50276, 2321, 420, 272, 5570, 1375, 342, 1375, 273, 277, 6855, 12544, 2250, 3425, 10806, 310, 4460, 285, 4217, 323, 436, 1895, 772, 50276, 328, 8250, 604, 10806, 327, 2250, 6430, 3815, 4217, 50276, 2369, 2590, 5649, 273, 15974, 436, 1895, 949, 29209, 23267, 50276, 2369, 5301, 281, 3365, 3733, 342, 760, 1327, 23283, 839, 2250, 6430, 50276, 41528, 1335, 1543, 275, 2250, 7658, 15927, 275, 8026, 8892, 5474, 33032, 2520, 789, 13698, 281, 897, 7473, 11515, 281, 823, 247, 10921, 29209, 2625, 275, 253, 830, 273, 247, 12339, 327, 253, 985, 672, 10806, 403, 13588, 627, 310, 671, 271, 4722, 10732, 273, 970, 271, 21496, 1754, 327, 253, 2250, 2892, 281, 8596, 253, 5570, 275, 17816, 15927, 2299, 891, 513, 417, 2868, 436, 2929, 858, 247, 1175, 2217, 2628, 275, 5999, 839, 436, 789, 275, 253, 3634, 273, 2720, 789, 50276, 249, 1798, 4049, 49072, 4240, 627, 310, 247, 1534, 2905, 789, 2593, 326, 1057, 271, 8718, 2628, 273, 12930, 1142, 643, 2987, 533, 281, 619, 3640, 4049, 49072, 4240, 310, 253, 954, 2074, 281, 436, 581, 19734, 253, 21496, 2568, 310, 417, 5393, 1060, 352, 310, 2834, 281, 1089, 512, 2905, 789, 273, 2282, 594, 891, 651, 11907, 18520, 342, 7000, 5740, 273, 253, 38135, 273, 436, 789, 275, 5301, 342, 326, 581, 891, 651, 671, 11907, 271, 625, 30457, 8368, 273, 253, 10527, 17653, 6787, 273, 253, 10921, 29209, 2625, 342, 1675, 281, 253, 8654, 3646, 347, 4049, 49072, 4240, 513, 285, 347, 310, 23115, 275, 253, 9782, 7544, 2929, 347, 273, 436, 18520, 2299, 516, 417, 2119, 891, 651, 5583, 352, 323, 9311, 23000, 891, 1804, 326, 253, 4477, 6266, 253, 10921, 29209, 5122, 247, 2372, 625, 19186, 352, 369, 12744, 1880, 352, 13840, 715, 295, 5943, 2442, 1159, 16182, 387, 806, 1509, 50276, 26122, 50275, 262, 651, 320, 5322, 281, 5513, 281, 253, 9414, 275, 27350, 2426, 752, 642, 18, 1678, 1622, 272, 2097, 2822, 436, 2505, 891, 2096, 326, 1996, 327, 436, 310, 5544, 533, 323, 19843, 352, 651, 320, 1175, 281, 452, 247, 2159, 8813, 1309, 253, 806, 29570, 273, 436, 1307, 347, 973, 50276, 262, 651, 320, 1175, 281, 19148, 275, 4677, 337, 752, 50275, 32888, 19, 310, 1580, 275, 253, 2022, 2505, 2822, 253, 4677, 310, 310, 816, 298, 83, 19, 285, 253, 50276, 261, 760, 5544, 2067, 7223, 6386, 50276, 266, 4722, 4602, 326, 1537, 320, 1160, 310, 326, 9782, 1162, 14350, 10921, 29209, 5122, 604, 253, 575, 1200, 15609, 1159, 310, 1754, 327, 247, 4767, 2662, 2442, 840, 253, 8654, 3646, 762, 253, 747, 278, 12132, 310, 
1335, 8654, 323, 253, 1711, 278, 12132, 352, 651, 320, 4722, 281, 923, 849, 973, 436, 6556, 762, 436, 6556, 762, 436, 20824, 275, 958, 436, 3133, 751, 1783, 326, 2067, 643, 2987, 452, 2218, 323, 247, 1077, 2074, 1895, 923, 2708, 50276, 74, 452, 7350, 670, 253, 38135, 273, 436, 1332, 352, 3133, 2581, 2074, 281, 50275, 12583, 49072, 355, 589, 936, 258, 19378, 260, 864, 660, 1519, 256, 9582, 285, 703, 8807, 247, 278, 4463, 376, 334, 3061, 11849, 342, 1327, 4698, 729, 757, 23267, 432, 298, 17945, 281, 3772, 255, 357, 833, 10921, 29209, 275, 575, 856, 22868, 273, 253, 44656, 8059, 327, 35221, 4715, 285, 3061, 2403, 391, 392, 78, 7266, 29226, 28933, 4240, 4049, 49072, 355, 589, 936, 258, 19378, 260, 864, 660, 1519, 256, 9582, 285, 703, 8807, 247, 278, 4463, 376, 334, 1327, 4698, 729, 757, 23267, 4469, 275, 298, 17945, 26766, 3186, 3066, 10921, 29209, 275, 10061, 273, 253, 28081, 5213, 18870, 35835, 327, 38183, 3186, 9267, 84, 7266, 22769, 9913, 4240, 50276, 35529, 326, 789, 29328, 247, 2074, 7792, 275, 247, 1199, 625, 7473, 1039, 275, 958, 275, 326, 789, 671, 247, 277, 6855, 310, 908, 347, 247, 10921, 29209, 2625, 50276, 4064, 752, 891, 476, 2028, 323, 253, 1072, 4096, 949, 247, 2074, 5122, 352, 310, 1896, 2299, 326, 891, 9829, 1633, 534, 39165, 253, 767, 2987, 50276, 23955, 789, 326, 476, 320, 23378, 50276, 615, 305, 14427, 19216, 15891, 2089, 365, 18205, 66, 891, 406, 4635, 2304, 1940, 3718, 7067, 285, 6969, 900, 20110, 9877, 35221, 4715, 323, 298, 17945, 71, 392, 35331, 7342, 575, 39962, 638, 3845, 549, 32693, 11395, 28166, 20084, 575, 7798, 50276, 74, 1158, 352, 310, 3782, 1774, 281, 5999, 366, 436, 789, 1561, 253, 3634, 273, 1110, 2571, 50274, 16691, 253, 2605, 273, 253, 2929, 369, 247, 2372, 512, 689, 253, 1659, 9560, 4278, 497, 5195, 4768, 285, 352, 2335, 479, 247, 4564, 273, 11999, 281, 1691, 1841, 2366, 323, 1650, 352, 369, 2649, 3240, 2590, 752, 253, 10921, 29209, 5122, 369, 1919, 891, 3047, 253, 9098, 285, 840, 574, 281, 564, 896, 281, 4677, 562, 326, 10323, 9098, 310, 2879, 281, 253, 10921, 604, 253, 7658, 310, 13588, 891, 651, 1804, 8133, 4623, 4278, 512, 275, 581, 1659, 323, 1650, 776, 10921, 29209, 1159, 269, 89, 369, 50275, 9138, 7658, 8411, 470, 5010, 2490, 187, 4118, 18435, 27, 783, 2929, 2175, 253, 1895, 273, 35221, 4715, 762, 2176, 10806, 327, 2250, 6430, 253, 30628, 5439, 1774, 7350, 5001, 337, 253, 2087, 16038, 374, 253, 1798, 15895, 273, 10806, 275, 2426, 273, 2250, 6430, 285, 495, 253, 17200, 285, 8453, 273, 5661, 1543, 253, 4477, 858, 417, 11929, 247, 30080, 22559, 1677, 253, 7350, 5439, 407, 253, 30628, 891, 11907, 253, 4477, 281, 3157, 253, 2929, 281, 6830, 501, 538, 2225, 281, 1529, 18767 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 10262, 271, 20926, 357, 833, 2746, 281, 37709, 2176, 3879, 273, 391, 77, 6083, 835, 3879, 310, 2931, 407, 247, 3425, 273, 5231, 436, 2746, 19584, 326, 253, 13722, 556, 3640, 273, 752, 403, 1175, 14367, 3879, 323, 247, 2173, 4836, 285, 326, 253, 3879, 476, 320, 10141, 407, 1133, 38059, 20926, 284, 390, 31385, 284, 1309, 3733, 10793, 824, 3879, 310, 5189, 253, 5570, 310, 1677, 247, 4016, 10921, 285, 253, 391, 77, 1375, 310, 31612, 342, 253, 277, 6855, 1375, 253, 4477, 3368, 264, 342, 1027, 1375, 42072, 3082, 24088, 581, 12022, 9706, 6311, 21496, 327, 495, 387, 1792, 8892, 50276, 783, 2929, 310, 4518, 3542, 891, 671, 751, 253, 2087, 3884, 273, 1794, 2355, 253, 6083, 17947, 1977, 432, 26016, 4811, 390, 5636, 600, 4404, 6799, 4811, 342, 2720, 3640, 2299, 891, 1089, 253, 1543, 1892, 281, 1239, 50276, 18, 4736, 253, 4736, 273, 436, 789, 310, 12744, 310, 352, 281, 3693, 37359, 3054, 1309, 17947, 50276, 31158, 390, 281, 14888, 2720, 3640, 715, 253, 5570, 281, 3885, 598, 4715, 390, 281, 6654, 5454, 14273, 875, 7658, 8411, 285, 10921, 13757, 352, 3133, 253, 4477, 403, 2820, 281, 513, 247, 2372, 273, 3253, 533, 840, 253, 7103, 310, 12497, 323, 1650, 672, 627, 403, 5454, 14273, 875, 8411, 285, 23267, 359, 1902, 281, 923, 5454, 2727, 9191, 3185, 273, 2014, 2792, 323, 5301, 1293, 253, 5454, 2727, 891, 9428, 6240, 253, 7658, 943, 3885, 598, 4715, 275, 534, 1083, 4715, 9191, 943, 320, 2011, 50276, 19, 29375, 253, 1543, 337, 752, 310, 253, 10921, 1159, 908, 891, 9428, 253, 12339, 943, 452, 247, 1781, 1055, 327, 253, 1543, 534, 476, 320, 24251, 281, 6635, 247, 5454, 2727, 6970, 374, 2139, 417, 1611, 281, 823, 253, 546, 1542, 1209, 1309, 3733, 247, 5777, 625, 2570, 8245, 651, 320, 281, 7767, 342, 5912, 337, 4259, 281, 1453, 253, 5454, 2727, 495, 3707, 323, 3036, 495, 987, 285, 3036, 577, 1669, 253, 10806, 36908, 1646, 281, 2818, 253, 1543, 1199, 32721, 432, 253, 1543, 273, 26724, 277, 47051, 285, 277, 82, 26159, 1542, 1209, 50276, 609, 841, 253, 1682, 7533, 281, 1071, 253, 2746, 50276, 1189, 455, 271, 4722, 285, 4460, 2934, 533, 1543, 403, 247, 2372, 14999, 7152, 33032, 2520, 2929, 10262, 271, 2746, 323, 1794, 2355, 271, 5570, 281, 3693, 1798, 2250, 6430, 841, 2250, 3425, 10806, 403, 2931, 342, 247, 30027, 6486, 1375, 3772, 13078, 277, 6855, 253, 5570, 310, 1677, 271, 3081, 29209, 10921, 326, 29697, 4219, 352, 323, 26554, 841, 10806, 281, 1056, 436, 271, 6927, 4715, 1895, 323, 253, 5570, 697, 1375, 310, 31612, 342, 3081, 1491, 2057, 271, 2250, 2892, 253, 1375, 273, 253, 277, 6855, 390, 271, 21496, 273, 253, 277, 6855, 1375, 253, 4477, 921, 326, 841, 7274, 513, 4796, 841, 2250, 7658, 15927, 689, 417, 2509, 2712, 670, 731, 50276, 953, 12744, 281, 479, 752, 253, 897, 1083, 310, 323, 10806, 12718, 327, 253, 2250, 2317, 273, 253, 5570, 285, 2139, 352, 651, 320, 4217, 281, 1555, 731, 436, 1039, 253, 4477, 41509, 285, 7568, 841, 10806, 327, 495, 387, 1792, 3958, 533, 352, 310, 2590, 326, 253, 10806, 597, 1705, 598, 342, 18123, 2818, 3045, 327, 954, 273, 253, 3958, 594, 597, 403, 417, 11138, 3045, 390, 5252, 273, 253, 5570, 403, 627, 4217, 10806, 326, 760, 878, 281, 1859, 253, 3425, 273, 5231, 273, 253, 5570, 285, 417, 667, 273, 253, 1375, 50276, 338, 627, 403, 824, 10806, 2139, 417, 3365, 4656, 253, 5570, 281, 760, 1379, 253, 3588, 5231, 752, 310, 253, 5649, 273, 760, 1794, 2355, 352, 281, 3693, 26554, 1110, 10806, 342, 247, 29209, 10921, 436, 12400, 369, 3732, 
1309, 5175, 533, 417, 1309, 3733, 50275, 249, 512, 533, 253, 806, 4836, 642, 337, 69, 277, 1622, 272, 275, 2740, 483, 5293, 273, 253, 4081, 7274, 497, 2104, 281, 4336, 13469, 7658, 15927, 2139, 369, 436, 604, 841, 403, 1663, 10806, 327, 253, 2250, 3425, 310, 2649, 436, 4645, 326, 253, 5933, 1057, 417, 789, 323, 253, 1895, 368, 403, 2820, 281, 8415, 50275, 783, 29209, 10921, 908, 323, 253, 1740, 387, 1792, 3958, 310, 9098, 275, 954, 789, 327, 277, 47051, 275, 387, 1792, 253, 2165, 23267, 403, 502, 6390, 281, 320, 875, 337, 285, 337, 281, 3157, 7882, 273, 253, 4715, 5933, 497, 253, 387, 1792, 23267, 502, 6390, 390, 440, 498, 6390, 275, 436, 1083, 858, 1907, 253, 29209, 10921, 320, 824, 1781, 9777, 452, 667, 10021, 2538, 327, 4715, 3045, 50276, 8052, 247, 29209, 10921, 323, 690, 6799, 3879, 273, 271, 5570, 310, 15246, 253, 625, 4460, 629, 273, 436, 789, 310, 275, 35919, 272, 253, 1375, 273, 253, 5570, 342, 253, 1375, 273, 247, 277, 6855, 326, 310, 12544, 253, 2250, 3425, 323, 7658, 15927, 1264, 7274, 403, 2429, 285, 352, 1057, 3176, 326, 277, 6855, 581, 12022, 310, 1805, 685, 253, 643, 7274, 390, 642, 42072, 50276, 856, 84, 50276, 2321, 420, 272, 5570, 1375, 342, 1375, 273, 277, 6855, 12544, 2250, 3425, 10806, 310, 4460, 285, 4217, 323, 436, 1895, 772, 50276, 328, 8250, 604, 10806, 327, 2250, 6430, 3815, 4217, 50276, 2369, 2590, 5649, 273, 15974, 436, 1895, 949, 29209, 23267, 50276, 2369, 5301, 281, 3365, 3733, 342, 760, 1327, 23283, 839, 2250, 6430, 50276, 41528, 1335, 1543, 275, 2250, 7658, 15927, 275, 8026, 8892, 5474, 33032, 2520, 789, 13698, 281, 897, 7473, 11515, 281, 823, 247, 10921, 29209, 2625, 275, 253, 830, 273, 247, 12339, 327, 253, 985, 672, 10806, 403, 13588, 627, 310, 671, 271, 4722, 10732, 273, 970, 271, 21496, 1754, 327, 253, 2250, 2892, 281, 8596, 253, 5570, 275, 17816, 15927, 2299, 891, 513, 417, 2868, 436, 2929, 858, 247, 1175, 2217, 2628, 275, 5999, 839, 436, 789, 275, 253, 3634, 273, 2720, 789, 50276, 249, 1798, 4049, 49072, 4240, 627, 310, 247, 1534, 2905, 789, 2593, 326, 1057, 271, 8718, 2628, 273, 12930, 1142, 643, 2987, 533, 281, 619, 3640, 4049, 49072, 4240, 310, 253, 954, 2074, 281, 436, 581, 19734, 253, 21496, 2568, 310, 417, 5393, 1060, 352, 310, 2834, 281, 1089, 512, 2905, 789, 273, 2282, 594, 891, 651, 11907, 18520, 342, 7000, 5740, 273, 253, 38135, 273, 436, 789, 275, 5301, 342, 326, 581, 891, 651, 671, 11907, 271, 625, 30457, 8368, 273, 253, 10527, 17653, 6787, 273, 253, 10921, 29209, 2625, 342, 1675, 281, 253, 8654, 3646, 347, 4049, 49072, 4240, 513, 285, 347, 310, 23115, 275, 253, 9782, 7544, 2929, 347, 273, 436, 18520, 2299, 516, 417, 2119, 891, 651, 5583, 352, 323, 9311, 23000, 891, 1804, 326, 253, 4477, 6266, 253, 10921, 29209, 5122, 247, 2372, 625, 19186, 352, 369, 12744, 1880, 352, 13840, 715, 295, 5943, 2442, 1159, 16182, 387, 806, 1509, 50276, 26122, 50275, 262, 651, 320, 5322, 281, 5513, 281, 253, 9414, 275, 27350, 2426, 752, 642, 18, 1678, 1622, 272, 2097, 2822, 436, 2505, 891, 2096, 326, 1996, 327, 436, 310, 5544, 533, 323, 19843, 352, 651, 320, 1175, 281, 452, 247, 2159, 8813, 1309, 253, 806, 29570, 273, 436, 1307, 347, 973, 50276, 262, 651, 320, 1175, 281, 19148, 275, 4677, 337, 752, 50275, 32888, 19, 310, 1580, 275, 253, 2022, 2505, 2822, 253, 4677, 310, 310, 816, 298, 83, 19, 285, 253, 50276, 261, 760, 5544, 2067, 7223, 6386, 50276, 266, 4722, 4602, 326, 1537, 320, 1160, 310, 326, 9782, 1162, 14350, 10921, 29209, 5122, 604, 253, 575, 1200, 15609, 1159, 310, 1754, 327, 247, 4767, 2662, 2442, 840, 253, 8654, 3646, 762, 253, 747, 278, 12132, 310, 
1335, 8654, 323, 253, 1711, 278, 12132, 352, 651, 320, 4722, 281, 923, 849, 973, 436, 6556, 762, 436, 6556, 762, 436, 20824, 275, 958, 436, 3133, 751, 1783, 326, 2067, 643, 2987, 452, 2218, 323, 247, 1077, 2074, 1895, 923, 2708, 50276, 74, 452, 7350, 670, 253, 38135, 273, 436, 1332, 352, 3133, 2581, 2074, 281, 50275, 12583, 49072, 355, 589, 936, 258, 19378, 260, 864, 660, 1519, 256, 9582, 285, 703, 8807, 247, 278, 4463, 376, 334, 3061, 11849, 342, 1327, 4698, 729, 757, 23267, 432, 298, 17945, 281, 3772, 255, 357, 833, 10921, 29209, 275, 575, 856, 22868, 273, 253, 44656, 8059, 327, 35221, 4715, 285, 3061, 2403, 391, 392, 78, 7266, 29226, 28933, 4240, 4049, 49072, 355, 589, 936, 258, 19378, 260, 864, 660, 1519, 256, 9582, 285, 703, 8807, 247, 278, 4463, 376, 334, 1327, 4698, 729, 757, 23267, 4469, 275, 298, 17945, 26766, 3186, 3066, 10921, 29209, 275, 10061, 273, 253, 28081, 5213, 18870, 35835, 327, 38183, 3186, 9267, 84, 7266, 22769, 9913, 4240, 50276, 35529, 326, 789, 29328, 247, 2074, 7792, 275, 247, 1199, 625, 7473, 1039, 275, 958, 275, 326, 789, 671, 247, 277, 6855, 310, 908, 347, 247, 10921, 29209, 2625, 50276, 4064, 752, 891, 476, 2028, 323, 253, 1072, 4096, 949, 247, 2074, 5122, 352, 310, 1896, 2299, 326, 891, 9829, 1633, 534, 39165, 253, 767, 2987, 50276, 23955, 789, 326, 476, 320, 23378, 50276, 615, 305, 14427, 19216, 15891, 2089, 365, 18205, 66, 891, 406, 4635, 2304, 1940, 3718, 7067, 285, 6969, 900, 20110, 9877, 35221, 4715, 323, 298, 17945, 71, 392, 35331, 7342, 575, 39962, 638, 3845, 549, 32693, 11395, 28166, 20084, 575, 7798, 50276, 74, 1158, 352, 310, 3782, 1774, 281, 5999, 366, 436, 789, 1561, 253, 3634, 273, 1110, 2571, 50274, 16691, 253, 2605, 273, 253, 2929, 369, 247, 2372, 512, 689, 253, 1659, 9560, 4278, 497, 5195, 4768, 285, 352, 2335, 479, 247, 4564, 273, 11999, 281, 1691, 1841, 2366, 323, 1650, 352, 369, 2649, 3240, 2590, 752, 253, 10921, 29209, 5122, 369, 1919, 891, 3047, 253, 9098, 285, 840, 574, 281, 564, 896, 281, 4677, 562, 326, 10323, 9098, 310, 2879, 281, 253, 10921, 604, 253, 7658, 310, 13588, 891, 651, 1804, 8133, 4623, 4278, 512, 275, 581, 1659, 323, 1650, 776, 10921, 29209, 1159, 269, 89, 369, 50275, 9138, 7658, 8411, 470, 5010, 2490, 187, 4118, 18435, 27, 783, 2929, 2175, 253, 1895, 273, 35221, 4715, 762, 2176, 10806, 327, 2250, 6430, 253, 30628, 5439, 1774, 7350, 5001, 337, 253, 2087, 16038, 374, 253, 1798, 15895, 273, 10806, 275, 2426, 273, 2250, 6430, 285, 495, 253, 17200, 285, 8453, 273, 5661, 1543, 253, 4477, 858, 417, 11929, 247, 30080, 22559, 1677, 253, 7350, 5439, 407, 253, 30628, 891, 11907, 253, 4477, 281, 3157, 253, 2929, 281, 6830, 501, 538, 2225, 281, 1529, 18767 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary the paper proposes a framework for efficient architecture search for graphs this is done by combining a differentiable dartslike architecture encoding with a transfer learning method that searches on smaller graphs with similar properties and then transfers to the target graphs the experiments show that egan matches or exceeds both handdesigned and nasdesigned gnns moreover the method is very fast to run recommendation overall i am voting to reject as several crucial pieces of information are missing from the current draft and some other parts are unclear the most novel part of the paper seems to be the transfer of architectures learned on subgraphs to larger graphs but there is little discussion on how that impacts downstream accuracy or if the transfer learning is at all needed given that the method is already very efficient moreover there is no information on how the choice of method used to select subgraphs affects the entire framework the experimental results look promising but there should be more care taken to assess statistical significance main pros 1 the general concept of adapting ideas from darts to work with graph neural networks is fairly natural 2 the set of tasks that are considered is broad and the comparison is performed across a range of baselines 3 selecting subgraphs with similar properties to the full graph searching for good architectures on those and then transferring to the full task is a very interesting idea main cons 1 the transfer learning method is only described very briefly which leaves several open questions how much are the graphs reduced section 33 mentions 15 of original size but its unclear if that concerns the framework presented in the paper or is that a value advocated for in some related work it is also unclear if that refers to number of nodes or number of edges surrounding text seems to suggest the former but then the gnn running time will be more affected by the latter what is the empirical impact of the reduced graph size on running time as the proposed method seems to be very fast is it unfeasible to run the search on the full graphs this would allow to estimate how much accuracy is lost due to searching on a proxy task instead of using the target task and how that changes as one varies the graph sizes for the proxy task the paper mentions using random pagerank node to select the subgraphs i wonder how that choice affects the results as opposed to doing something more trivial eg dropping nodes andor edges uniformly at random 2 the bolding of results tables 2 4 6 is a bit misleading since many of the differences do not seem statistically significant eg computer proteins or are even zero dd it would be better to perform a statistical significance test and make the statistical significance clear by bolding several joint top values 3 from appendix 32 i understand that most baseline results were produced by the authors as opposed to copying the numbers from related works how were the hyperparameters for the different baselines tuned in particular im concerned about the last paragraph on page 14 which mentions that the number of layers for all models was fixed to 3 as the number of layers is one of the most important gnn hyperparameters im not sure if simply fixing it to 3 for all baselines is entirely fair other comments the paper says that the output of the last gnn layer is always used by the layer aggregator motivating it as 
injecting relational bias my understanding is that this just ensures there are no dead ends in the resulting computational graph which is a common idea in nas works im not sure how relational bias is related to this 1 defines two variants of the gumbel softmax trick besides the basic one theres also the straightthrough variant which uses hard ie one hot samples on the forward pass and the soft approximations as defined in equation 5 on the backward pass which variant did the authors use the paper motivates the transfer learning approach by saying that a klayer gnn needs to process khop neighbours the number of which can be exponential in k this seems to suggest the gnn running time grows exponentially with k which is of course not true in fact every gnn propagation step requires the same amount of compute proportional to the number of nodes and edges in the graph the first dot in section 34 says that egan is better than existing gnnnas methods as it has a larger and thus more expressive search space but then goes on to say that egan is also better as it has a smaller and thus cheaper to search search space this feels a bit contradictory it would be fine to just state that the egan search space has a different focus ie searching only over the core gnn propagation aggregation part 2 report several simple handdesigned gnn variants that get 0992 on ppi table 3 reports that searching on the arxiv dataset took 10 seconds how is that even possible as far as i understand the architecture search involves training a supernet containing 12 different aggregation types for each of the 3 gnn layers among other things for some number of epochs repeating that process 5 times for different seeds and then retraining the best architecture from scratch can you comment on how long each of these stages takes in the gnn literature results on dd and proteins table 4 are reported in two different ways some papers eg 3 report the validation set results as the final metric despite it being available to the model selection procedure while others eg 4 report the result on the test set which was not seen by any part of the pipeline i understand the authors follow the latter strategy can you confirm appendix 42 says we use the global sum pooling method at the end of retraining what does that mean reading the paper feels a bit bumpy since there are some sentences that are hard to read and therefore could be revised examples i include only part of each sentence just to make it identifiable page 2 in the literature we page 4 note that in our work page 6 jiang balaprakash 2020 is page 7 first we second we caption of table 5 page 16 for other methods small remarks typos and grammar issues did not influence my rating recommendation the egan abbreviation may be a bit misleading since one could assume it refers to generative adversarial networks abstract soat page 1 one shot methods nas methods page 1 in orders more efficient orders of magnitude more efficient pages 1 8 missing space after citation pages 2 3 for the best of our knowledge to the best of our knowledge page 3 computational computationally equation 1 l should be in parenthesis page 3 have been demonstrated effectiveness have been demonstrated effective page 4 kth kth page 5 to robust the final results to make the final results more robust page 8 by page 13 feature vector obtains feature vector obtained page 16 as evaluate metric as the evaluation metric throughout the paper in this part in this section references 1 categorical reparameterization with gumbelsoftmax 
2 gnnfilm graph neural networks with featurewise linear modulation 3 how powerful are graph neural networks 4 a fair comparison of graph neural networks for graph classification comments after rebuttal i would like to thank the authors for their detailed response many things were addressed and i have increased my score to 5 to reflect that the paper is not far from the acceptance threshold i think the main thing missing is a discussion of the effect of the sampling of subgraphs ie showing that pagerank is indeed better than choosing nodes at random and analysing how the results change when the reduction percentage is varied between a low value and the maximum value that fits in gpu memorydocsepthis paper presents a differentiable nas method named egan for automatically designing gnn architectures the main contribution is searching gnn architectures efficiently with a oneshot framework based on stochastic relaxation and the natural gradient method extensive experiments conducted on nodelevel and graphlevel tasks show the efficiency and effectiveness of the proposed search method pros paper is wellwritten and easy to follow the proposed search space with node aggregators and layer aggregators is interesting the design of the baseline methods including random and bayesian search is appreciated empirical results on different datasets and tasks are very strong cons limited novelty the proposed search method is very similar to snas xie et al 2018 except for the search space a similar oneshot search method for gnns has been proposed in sgas li et al 2020 which weakens the claimed contribution of being the first oneshot nas method for gnns it is not clear why the stochastic natural gradient method is needed the performance of graphnas with egans search space table 11 is close to the performance of egan table 2 therefore the performance gain mainly comes from the welldesigned search space there is a lack of comparison of the models parameters across different search methods thus it is not clear whether the experiments are conducted under a fair setting other comments the obtained architecture in figure 2 b includes sage but it is not clear which aggregator it is using the contribution in terms of transfer learning is weak since the proxy graph is a subsample of the large graph the identical nodes have been exposed during the search phase references xie s zheng h liu c and lin l 2018 september snas stochastic neural architecture search in international conference on learning representations li g qian g delgadillo ic muller m thabet a and ghanem b 2020 sgas sequential greedy architecture search in proceedings of the ieeecvf conference on computer vision and pattern recognition pp 1620-1630 post rebuttal comments thank the authors for the detailed response i keep my rating as 5docsep this work proposes an efficient graph neural architecture search to address the problem of automatically designing gnn architectures for any graphbased task compared with the existing nas approaches for gnns the authors improve the search efficiency via the following three components 1 a slim search space only consisting of the node aggregator layer aggregator and skip connection 2 a oneshot search algorithm which is proposed in previous nas work and 3 a transfer learning strategy which searches architectures for large graphs via sampling proxy graphs however the current performance improvement over the humandesigned models is marginal which diminishes their research contribution the paper organization is clear but some expressions should be
improved the details are listed as below typos in the abstract stateoftheart should be abbreviated as sota not soat typos l(theta, z) after equation 4 is not defined should it be l(w, z) as used in equation 4 clarity the explanation before equation 5 is a bit confusing and should be reorganized there is a grammar error the absence of a sentence subject in the first sentence however in this work to make use of the differentiable nature of l(z) and design a differentiable search method to optimize eq 4 clarity the notations related to the variable z_ij ie z_ij^t and z_ij^k are not defined well what is the difference between the superscripts t and k the pros of this work are summarized in terms of the three components used in egan which improve the search efficiency the experiment results show that their framework greatly reduces time compared with the graphnas bayesian search and random search major questions 1 in the introduction we doubt that designing proper gnn architectures will take tedious efforts as far as i know the architecture parameters of the humandesigned models do not require extensive tuning efforts on the testing benchmark datasets furthermore most of the architecture parameters could be shared and used among the testing datasets to achieve competitive performances 2 it is unclear for the second challenge that the oneshot methods cannot be directly applied to the aforementioned dummy search space there are some oneshot models with the parameter sharing strategy used for searching the hidden embedding size 3 in section 31 why is the dummy search space very large the search space seems only to include the aggregators and hidden dimensions it might be much smaller than the search space of cnns 4 their search space assigns skip connections between the intermediate layers and the final layer which is contradictory to the common case where the skip connections could be applied among the intermediate layers as shown in 1 the skip connections may exist between any two layers could you provide reasons on the design of the skip connection limitation 5 in the node and graph classification of the experimental section the performance improvement over the humandesigned models is marginal this would not justify the motivation of applying nas to search graph neural networks the authors should provide more discussion on the contribution of this work in terms of research and industrial applications 6 the marginal performance improvement might result from the search space currently the authors search space is based on traditional message passing approaches they should consider more recent developments in gnns to further improve the performance 7 the selection of baselines is unfair the search space contains the skip connection components based on the jknetwork however the authors excluded the important baseline in 2 which could achieve comparable performance on the citeseer and reddit datasets for the graph classification task the authors also excluded a lot of pooling methods such as the graphunet 3 which achieves better performance than the proposed approach 1 rong yu et al dropedge towards deep graph convolutional networks on node classification international conference on learning representations 2019 2 xu keyulu et al representation learning on graphs with jumping knowledge networks arxiv preprint arxiv 1806.03536 2018 3 gao hongyang and shuiwang ji graph unets arxiv preprint arxiv 1905.05178 2019 docsepthe paper presents a nas method for graphstructured learning which focuses on constructing a search space tailored to graph
neural networks based on different node aggregators skipconnections and layer aggregators in contrast to related gnnnas approaches the authors apply a oneshot training paradigm where model parameters and network architectures are learned simultaneously in an endtoend fashion to enable nas in large graphs the authors further apply a transfer learning scheme where the optimal architecture is first learned on a sampled subgraph and model parameters are finetuned later on in the original graph the empirical evaluation includes experiments for node classification and graph classification on which the proposed approach consistently performs better than or on par with humandesigned gnns while being more efficient than related nas approaches the paper is easily comprehensible although it contains some typos and grammatical flaws the proposed approach is inspired by the idea of allowing different gnn operators for different layers adhering to different expressive capabilities which is quite interesting and a fairly ignored topic in the current literature in some sense this is related to pna 1 and it would be great to discuss this relationship in the paper the experiments look promising showing both the strength of the proposed approach in terms of performance as well as in efficiency compared to related nas approaches during training the architecture and the model parameters are learned simultaneously based on a gumbel softmax formulation via gradient descent as far as i understand this requires the model to compute the output of every possible search state in every optimization step which does not seem to scale to larger search spaces is my understanding correct and if so how can the proposed method scale to larger search spaces furthermore other hyperparameters are tuned later on eg dropout ratio or hidden feature dimensionality which might prevent finding the right architecture in the first place furthermore this manual hyperparameter tuning might make the efficiency comparison to related nas approaches somewhat unfair additional comments eq 4 refers to optimizing the network architecture based on training loss performance while in general one wants to find hyperparameters and network architectures that perform well on the validation set please clarify some baseline results are a bit weak and do not reflect the results reported in the official papers eg for ppi and reddit for example the graphsage paper reports a 95.4 micro f1 score on reddit while the paper reports 93.8 the transfer learning experiments do not seem that convincing to me given my previous comment regarding reddit results and the performance on ogbnarxiv compared to a humandesigned gnn i personally think that the transfer learning idea has potential but may need more work in order to see clear benefits is there some intuition why specific architectures win over others which architectures generally perform better than others can there be guidelines extracted for better gnn architecture design decisions 1 corso et al principal neighbourhood aggregation for graph nets neurips 2020 post rebuttal comments i would like to thank the authors for their insightful rebuttal and clarifications most of my concerns have been properly addressed and i very much appreciate the discussion about pna however since the authors train a supernet it is not particularly clear to me why one even needs to decide on a specific aggregation scheme in contrast to pna eg by simply using the softmax function instead of the gumbel softmax formulation furthermore im still not that convinced
about the transfer learning proposal in my opinion a more indepth analysis is needed both theoretical and practical to justify the claims made in the paper since gnns do not even need to be trained in fullbatch mode ie via graphsage clustergcn graphsaint im not sure which benefits the proposed approach brings to the table in comparison to any other scalable methods therefore my rating will stay the same ### Summary:
this paper presents a differentiable neural architecture search method for gnns using gumbel softmaxbased gating for fast search it also introduces a transfer technique to search architectures on smaller graphs with similar properties as the target graph dataset the paper further introduces a search space based on gnns message aggregators skip connections and layer aggregators results are presented on several undirected graph datasets without edge features on both node and graph classification the reviewers mention that the results are promising but they unanimously agree that the paper does not meet the bar for acceptance in its current form i tend to agree with the reviewers in that the effect of the individual contributions search space vs method vs transfer needs to be better disentangled and studied independently and that it is unclear why selecting a single aggregation function out of many is important vs choosing multiple ones at the same time such as in pna 1 as pointed out by r1 this should be carefully studied going forward lastly all reviewers agreed that the proposed transfer method requires more detailed experimental validation and motivation 1 corso et al principal neighbourhood aggregation for graph nets neurips 2020
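As an aside on the mechanism referred to in this example, the sketch below illustrates how a gumbel softmax gate over a set of candidate gnn aggregators can make the layer-wise operator choice differentiable. It is a minimal illustration under assumed settings (the candidate list, temperature, and class names are hypothetical), not the code of the paper under review.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch_geometric.nn import GCNConv, SAGEConv, GATConv  # assumed candidate ops

class GumbelAggregatorChoice(nn.Module):
    """Differentiable choice over candidate aggregators for one GNN layer (sketch)."""
    def __init__(self, in_dim, out_dim, tau=1.0):
        super().__init__()
        self.candidates = nn.ModuleList([
            GCNConv(in_dim, out_dim),
            SAGEConv(in_dim, out_dim),
            GATConv(in_dim, out_dim, heads=1),
        ])
        # architecture parameters, trained jointly with the model weights
        self.arch_logits = nn.Parameter(torch.zeros(len(self.candidates)))
        self.tau = tau

    def forward(self, x, edge_index):
        # soft, nearly one-hot gate; gradients flow back into arch_logits
        gate = F.gumbel_softmax(self.arch_logits, tau=self.tau, hard=False)
        outs = torch.stack([op(x, edge_index) for op in self.candidates])
        return torch.einsum("k,knd->nd", gate, outs)

    def chosen(self):
        # after training, the discrete architecture is read off the argmax logit
        return int(self.arch_logits.argmax())
```

Note that every candidate operator is evaluated in each forward pass, which is exactly the scaling concern about larger search spaces raised in the review above.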
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: this paper is well written it proposes using multiple neural networks one for each subdomain instead of a single pinn over the entire domain because increasing domain size causes low frequencies in the solution to be viewed as high frequency features by a neural network the argument is that using individual neural networks will more effectively learn high frequencies due to the smaller domains seen by each neural network compared to a single pinn over the entire domain this approach of dividing the domain seems straightforward and reasonable although not novel the value of this work can be improved by comparing the tradeoffs in performance compute and memory between one pinn and the multiple neural networks in a fbpinn also there is the question of whether this is scalable given the increase in the number of neural networks the authors can also better describe the procedure for designing and applying a good fbpinn for example any domain decomposition can be used as long as the 54 subdomains overlap how should the domain decomposition be selected how to compare and evaluate a particular domain decomposition against alternatives figure 2 b d f i show that both fbpinn and pinn perform similarly against the exact solution so it is unclear what the argument is for using fbpinn over pinn from those four subfigures figures g and h and the text describe faster convergence of fbpinn but the differences can be highlighted in subfigures b d f i figure 3 e and f are more useful docsep the authors propose to use pinn but separating the domain into windows and then normalizing and running a different neural network in each window learning higher frequencies much more slowly than lower frequencies is not a major problem it shouldnt prevent learning the correct solution given enough training time it hasnt been a major issue in other applications so why would it be in this one so the premise that we should cut the problem into separate windows seems wrong one should only need a good enough neural network architecture having to use 30 different neural networks to solve a toy 1d problem seems crazy i cant imagine how many networks would be needed in large domains if the domain were r^p would we use 15^p different neural networks this is not scalable it would be important to show the total number of parameters for the fbpinn vs the pinn to really show that its more parsimonious in number of parameters ideally you would compare in a setting with equal number of parameters at various sizes ex 30 neuralnets with 500 parameters vs 1 neuralnet with 500 x 30 parameters 30 neuralnets with 1000 parameters vs 1 neuralnet with 1000 x 30 parameters etc this could be a plot with one linecurve for each number of neural networks eg 100 500 1000 2500 and the xaxis would be the total number of parameters this would really help convince that this idea is worthwhile also optimizing the architecture a bit would be good to help convince that fbpinn is really better because in my view a bigger network with the right activations layernormalization structure residuallayer would do better or as well as the multitude of small neural networks in small windows keeping to simple toy datasets is okay but more experiments at equal computation are needed to convince and you need at least one large dimension experiment like one in r^100 and again show that fbpinn is better with the same number of total parameters than pinn the fact that fbpinn did not
work on the burgers equation due to discontinuity is again a sign to me that we shouldnt separate the domain into windows i dont believe in the premise that the domain should be cut into windows and the experiments are not convincing enough to dispel these doubts docsep the authors present a methodology for applying physicsinformed neural networks pinns to large domains while mitigating the scalability and spectral bias problems inherent to said networks they divide the domain into overlapping subdomains and normalise the input variables within each subdomain a nn is trained for each subdomain the authors test their method over differential equations and show through these examples that their method outperforms the pinns they also highlight some limitations of the methodology it can be worthwhile to finetune the division of the subdomains for each particular problem the paper is clear and well written comments the operator $\mathcal{C}$ of equation 3 is introduced in line 56 but not described in the paper the authors reference the reader to [1] however as this operator allows the boundary conditions to be satisfied automatically (line 56) i would suggest incorporating a paragraph describing it and its properties while describing the methodology the authors indicate that unnorm is a common unnormalization unique for all subdomains (line 53) i would recommend commenting on how this function should be selected in particular they could describe how it was selected for each of the given examples figure 2 a and e suggest that the number of subdomains 30 has been selected to be equal to $\frac{1}{2\pi}\omega\Delta x$ where $\omega$ is the highest frequency and $\Delta x$ is the length of the domain is this a coincidence does the method also work for another decomposition scheme eg using 29 or 31 subdomains the choice of the domain decomposition should be better motivated the problems presented in the results section might be considered rather simple since their analytic solutions are easy to calculate i would suggest incorporating some more complex differential equations ### Summary:
reviewers generally seemed to be skeptical about this paper one significant concern raised was the scalability of the gridlike spatial decomposition which increases exponentially with the dimension indeed pinns have so far achieved the most success in complicated regimes such as highdimensional pdes for which traditional often gridbased methods are lacklustre in lowdimensional regimes they are usually often outperformed by traditional solvers as such it seems that a hybrid approach may produce a method that is not competitive in any regime the lack of evidence to the contrary is definitely the single greatest weakness of the paper on a more positive note reviewers agreed that the paper was clearly written i commend the authors for a paper that is far above the usual standard in this regard i do believe the principle of the proposed divideandconquer approach to be a reasonable one as a practical matter it is no doubt much easier to fit each small subnetwork to a small problem than it is to fit a single large network to the whole thing overall i am inclined to accept the paper the proposed approach has clear limitations but fixing them seems like it would make for an interesting line of work and this paper is of the discussionprovoking kind that is a good fit for a workshop
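To make the divide-and-conquer idea discussed in this example concrete, the sketch below shows one possible way to combine overlapping 1d subdomain networks with per-window input normalisation into a single prediction. It is a hedged illustration under assumptions (window shape, overlap, and network sizes are made up), not the fbpinn reference implementation, and it omits the partition-of-unity window normalisation and the pde-residual training that the actual method uses.

```python
import torch
import torch.nn as nn

class SubdomainNet(nn.Module):
    """Small MLP owning one overlapping 1d subdomain (sizes are assumptions)."""
    def __init__(self, center, half_width, hidden=16):
        super().__init__()
        self.center, self.half_width = center, half_width
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def window(self, x):
        # smooth-ish bump: roughly 1 near the subdomain centre, 0 outside it
        z = (x - self.center) / self.half_width
        return torch.clamp(1.0 - z.abs(), min=0.0) ** 2

    def forward(self, x):
        z = (x - self.center) / self.half_width  # per-window input normalisation
        return self.window(x) * self.net(z)

class FBPINNLikeModel(nn.Module):
    """Sum of windowed subdomain networks over [x_min, x_max] (illustrative only)."""
    def __init__(self, x_min=-2 * torch.pi, x_max=2 * torch.pi, n_sub=30, overlap=0.3):
        super().__init__()
        width = (x_max - x_min) / n_sub
        centers = torch.linspace(x_min + width / 2, x_max - width / 2, n_sub)
        self.subnets = nn.ModuleList(
            [SubdomainNet(c.item(), (1 + overlap) * width / 2) for c in centers]
        )

    def forward(self, x):  # x has shape [n, 1]
        return sum(net(x) for net in self.subnets)
```

A training loop would fit this combined output against the differential equation residual and boundary conditions; the point of the sketch is only the overlapping windows and per-subdomain normalisation that the reviewers and the meta-review discuss.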
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: the authors present a dataset for benchmarking the performance of ecg and ppg waveform imputation methods by leveraging publicly available icu waveform data and using realistic missingness patterns to create simulated gaps to be imputed while i believe that some of the baseline model choices dont make sense overall the paper is well written and timely they share the waveform data along with missingness masks and trainvalidationtest splits so that others can easily compare imputation methods to baseline model performance several baseline models from the literature were implemented and tested along with a novel transformer model to serve as an additional baseline strengths of the dataset include the scale and the realistic masking procedure used to generate missingness that tries to closely mirror patterns seen in mhealth systems i also appreciate that a downstream prediction task is also included in the dataset since often users care about how imputation quality will affect application performance many of the potential concerns i had with the dataset selection eg using icu patients mhealthspecific missingness patterns differences in ecg sensor quality from hospital to mhealth were preemptively addressed by the authors explanation in appendix a1 im not sure that mean and linear imputation necessarily make sense as baselines if a peak is defined as a local maximum ie $x_i > x_{i-1}$ and $x_i > x_{i+1}$ at index i in signal x then by definition there will be no peaks with mean or linear interpolation and the peak classification will fail this may explain the nan values and zero sensitivity of the mean and linear imputation in table 2 why not use a simple fftbased baseline since these are quasiperiodic signals the choice to not utilize the mimic matched waveform database so that covariates can be used either in the modeling process or during evaluation is also a weakness many parameters such as resting heart rate max heart rate etc vary as a function of age disease status medication usage etc and as a benchmarking dataset it would be nice to be able to incorporate that data docsep an existing architecture was slightly transformed and transferred to another domain ie wearables to address a gap in physiologic signal imputation the approach presented shows superior performance compared to existing work the authors demonstrate that the current stateoftheart algorithms trained on ecg and ppg waveform perform poorly on signals obtained from wearables table 1 outlines a clear comparison to existing work the explanation why multichannel is out of scope is solid the approach was only tested on one dataset for each domain validation on other datasets would strengthen the paper docsep this paper raises the issue of missing data in pulsative signals collected from wearable devices and introduces an imputation benchmark pulseimpute specifically the authors extracted missingness patterns from realworld mhealth settings and applied them to two existing pulsative signal datasets they reproduced several existing imputation methods and proposed a new imputation method and used them as baselines applying these baseline imputation methods to their processed pulsative signal datasets with missing values the authors proposed benchmarks for the downstream tasks this paper defines an interesting research question and provides code to replicate the baseline results i can see why the authors chose to apply the missingness patterns in the mhealth
settings to the existing pulsative signal datasets rather than collecting data directly from mhealth wearables due to a possible lack of ground truth admittedly the authors also compared several differences between data collected from mhealth settings and clinical settings however the lack of a direct comparison of the differences in missing patterns in the data collected from the two settings and what exactly the missing patterns are in the data collected from mhealth wearables makes the choice of curated datasets to address the question raised in this paper less convincing what makes pulsative signals more interestingcompelling than other signals from a missingness perspective was less motivated the paper is a bit hard to follow and there are some inconsistencies and inaccessible references docsepthe usage of wearable sensors for medical purposes promises better monitoring with high frequency information however the usage of such sensors worn in daytoday life often leads to gaps in the sensory information imputation techniques attempt to fill such gaps but are lacking for pulsative signals like ecgs and ppgs this submission supplies datasets methods to mimic realistic missingness in the data as well as challenge tasks to evaluate pulsative signal imputation further the authors propose a benchmark transformer model where the signal tokenizer is realized via dilated convolution and empirically demonstrate significant outperformance compared to previous work on their benchmark tasks the proposed datasets and challenges appear highly relevant to facilitate broad usage of wearable sensors for medical purposes the definition of evidencedbased patterns of missingness fills a gap in previously published work to that end that definition is made accessible by the proposed challenge which is based on existing and peerreviewed datasets incorporating the patterns missingness the experiment design is clear and thoughtful the downstream taks are both difficult and relevant tasks for pulsative signals to test imputation methods the data provided makes it easy for researchers to work on the topic the authors propose a model architecture which demonstrates significant outperformance on the benchmark tasks in my view there are no major weaknesses two minor issues first the presentation of the bdc architecture may be a bit too confident in ll 211 following the authors state their requirements for their benchmark model and state that no existing transformer models address all three issues sufficiently id argue that these are rather common issues as an example images pixels require local context require some measure against permutation equivariance and have to deal with scaling of long sequences vision transformershttpsarxivorgabs201011929 vit have found ways to deal with that so has the perceiverhttpsarxivorgabs210303206 architecture directly on the data cvthttpsopenaccessthecvfcomcontenticcv2021paperswucvtintroducingconvolutionstovisiontransformersiccv2021paperpdf adds convolution data encoders to vit similar to the authors approach that does not take anything away from their model architecture which they show works fine id find it fair though to put it into context of other work the authos further state that positional encodings dont have a good inductive bias in their setting line 239 im curious is that empirical or are there other reasons to think that on other domains again images as example position encodings work surprisingly well to contextualize second equation 1 seems somewhat overcomplicated 
selfattention is commonplace and a softmax would have simplified it greatly docsepin this paper the authors proposed a new benchmark task for physiological sensor signal imputation specifically the authors focus on ecg and ppg signals and use public datasets for evaluation they first simulated realistic missing patterns for ecg and ppg signals by ablating samples and built a dataset with realistic missing data they implemented eight existing traditional or modern timeseries imputation techniques moreover they also proposed and developed a new bottleneck dilated convolution bdc selfattention architecture that fits the characteristic ecgppg data these data usually have a longrange structure as pulsative signals the authors then evaluated the performance of these algorithms on 1 the raw signal reconstruction task and 2 three downstream tasks a heartbeat detection in ecg b heartbeat detection in ppg and c cardiac pathophysiology classification in ecg their results indicate that the new bdc technique is significantly better than all baselines the topic of signal sensor imputation is an important realistic and very practical problem in mhealth daily physiological data collection both the raw construction error and the three downstream tasks are valid experiment designs the design of the new architecture does leverage the pattern of ecgppg properties and the advantage over the baseline methods are encouraging physiological signal types may go beyond ecgppg while ecgppg is arguably one of the most commonly collected physiological signals in mhealth applications there are other common sensors that are not covered in this paper such as imu accelerometer gyroscope and magnetometer gsr eeg etc i am not arguing that the authors must evaluate their techniques on these signals but their characteristics could be very different from ecgppg signals for example they may not have a clear pulsative pattern i am curious to know the authors consideration of the generalizability of the technique and the potential necessity to tone down the papers framing or clarify the scope of the paper baseline method selection the authors compare the new technique against eight baseline techniques which is great however why not compare against the two existing papers that specifically focus on imputing mhealth pulsative signals ie 20 40 the citation number in the paper please justify extremely low recall for baseline technique related to the previous point in table 2 the performance of the baseline techniques all have very low recall thus low f1 score the authors provided some reasons in the text which is great but such a low performance raises a concern that whether these baselines are too easy to beat comparing against the sota technique could provide more valid results 20 arman iranfar adriana arza and david atienza relearn a robust machine learning framework in presence of missing data for multimodal stress detection from physiological signals arxiv preprint arxiv210414278 2021 40 hillol sarker matthew tyburski md mahbubur rahman karen hovsepian moushumi sharmin david h epstein kenzie l preston c debra furrholden adam milam inbal nahumshani et al finding significant stress episodes in a discontinuous time series of rapidly varying mobile sensor data in proceedings of the 2016 chi conference on human factors in computing systems pages 44894501 2016 ### Summary:
This paper develops a new benchmark for missing-data imputation in pulsative signals like ECG, using realistic missingness models. I expect such a dataset to drive important developments in this understudied area; indeed, the authors show that standard SOTA methods fail. The reviewers' enthusiasm makes this paper a clear accept.
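One concrete point raised by the first reviewer above, that a strict local-maximum definition of a peak can never fire inside a gap filled by mean or linear imputation, is easy to check numerically. The sketch below is illustrative only: the toy signal, the gap location, and the strict-inequality peak definition are assumptions based on the reviewer's wording, not the benchmark's actual evaluation code.

```python
import numpy as np

def strict_local_maxima(x):
    # Indices i with x[i-1] < x[i] > x[i+1].
    x = np.asarray(x, dtype=float)
    return [i for i in range(1, len(x) - 1) if x[i - 1] < x[i] > x[i + 1]]

# Toy pulsative signal with a missing segment.
t = np.linspace(0, 2 * np.pi, 200)
signal = np.sin(5 * t)
gap = slice(80, 120)
observed = signal.copy()
observed[gap] = np.nan

# Mean imputation turns the gap into a flat segment.
mean_filled = observed.copy()
mean_filled[gap] = np.nanmean(observed)

# Linear interpolation turns the gap into a straight line.
idx = np.arange(len(observed))
valid = ~np.isnan(observed)
linear_filled = observed.copy()
linear_filled[gap] = np.interp(idx[gap], idx[valid], observed[valid])

in_gap = lambda peaks: [i for i in peaks if 80 <= i < 120]
print(len(in_gap(strict_local_maxima(signal))))         # at least one true peak lies in the gap
print(len(in_gap(strict_local_maxima(mean_filled))))    # 0: a flat fill has no strict maxima
print(len(in_gap(strict_local_maxima(linear_filled))))  # 0: a straight line has no strict maxima
```

The result is consistent with the reviewer's explanation for the NaN values and near-zero recall of the mean and linear baselines in Table 2: a peak detector has nothing to find inside such imputed spans.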
input_ids: [30003, 310, 1677, 2278, …, 2590, 2997] (token-ID encoding of the example text above)
attention_mask: [1, 1, 1, …, 1] (all ones; no padding in this row)
labels: identical to the input_ids list above, token for token
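The three integer columns in each row (input_ids, attention_mask, labels) are just a tokenized copy of the prompt-plus-review-plus-summary text. Below is a minimal sketch of how such columns are typically produced with the Hugging Face transformers library; the tokenizer checkpoint and the 2048-token cap are assumptions for illustration, since the dataset itself does not state which tokenizer was used.

```python
from transformers import AutoTokenizer

# Assumption: a GPT-NeoX-style tokenizer; the dataset does not document the actual checkpoint.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

prompt = (
    "Below is given review of a research paper from cnoference journal. "
    "Please write a summary the review.\n### Review:\n..."
)
summary = "this paper develops a new benchmark for missing data imputation ..."

enc = tokenizer(prompt + summary, truncation=True, max_length=2048)

row = {
    "input_ids": enc["input_ids"],            # integer token IDs, as in the list above
    "attention_mask": enc["attention_mask"],  # all ones when nothing is padded
    "labels": list(enc["input_ids"]),         # causal-LM fine-tuning often copies input_ids into labels
}
```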
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: mbw are proposed to tackle two problems 1 uncalibrated but synchronized videos with few views and 2 a limited manual annotations the main contributions lie in 3 aspects 1 propose a zoo dataset with uncalibrated two synchronized views 2 propose a semisupervised framework by leveraging mvnrsfm and optical flowbased algorithms 3 study different factors of the proposed method on the proposed dataset systematically s1 the paper create a zoo dataset with uncalibrated two synchronized views the mutliview animal datasets are rare and will be useful for future research s2 the paper proposes a semisupervised framework by leveraging mvnrsfm and optical flowbased algorithms mvnrsfm handles the uncalibrated few views while optical flowbased methods employ limited annotations to generate candidates the whole framework run iteratively to perform autolabeling s3 the experiments show that they achieve similar performance with 1 annotations to fully supervised methods besides the authors ablate the proposed method on the proposed dataset systematically w1 technical novelty is limited the merits of handling few uncalibrated views are originated from mvnrsfm and the 2d landmarks are generated from offtheshelf flowbased model the main techinical contribution is to use them in a semisupervised scheme w2 as shown in fig 4a the performance of mbw drops as the number of views increases whats the reason behind that w3 the experiments show that the method works well when there is only 1 annotation whats the minimum percentage for the algorithm to work well w4 what will happen if optical flowbased model performs not very well any requirement for the minimal performance of optical flowbased model docsepin this paper the authors have combined a nonrigid 3d neural prior with deep flow to obtain highfidelity landmark estimates from videos with only two or three uncalibrated handheld cameras utilizing multiview nonrigid structure from motion mvnrsfm to more reliably estimate camera poses and 3d landmark positions from noisy 2d inputs with few cameras and leveraging deep optical flow the paper presents a novel process the dataset collected for this project is another contribution to the community the ability to do achieve these results without calibration is worth highlighting calibration is a critical process and in many cases essential but by eliminating the need to calibrate the proposed idea can expand to variety of applications to capture the 3d reconstructions of the unoccluded objects of interest with the minimal cameras is also worth mentioning the significantly low number of annotated frames needed for this method to work is impressive the comparison in table 1 is also highly appreciated as a reader the approach is clearly laid out the paper is well written and easy to read the dataset being released is very small it captures the essence of the proposed idea but seems lacking releasing a larger dataset with multitude of variations in addition to the mentioned ones would help further help also to highlight the strength of the technique impact of the technique on semioccluded objects in the scene would further help in appendix d the idea for bounding box estimation and reducing the problem to single objectsingle frame sub problem is good but better techniques can be used for the same this could help in reducing the number of iterations to solve this also it would help to understand the impact of the 
bounding boxes not completely covering the chimpanzee in the scene docsepthis paper proposes multiview bootstrapping in the wild mbw a semisupervised pipeline that can annotate 2d and 3d landmarks of articulated objects given videos captured by two or three uncalibrated handheld cameras where 12 of frames are manually annotated specifically the proposed pipeline builds a neural shape prior from sparse manual annotations using multiview nonrigid structure from motion mvnrsfm then propagates the initial labels through the entire video using a pretrained optical flow method the propagated 2d landmark candidates are used to iteratively retrain the mvnrsfm as well as a 2d detector the authors collect the zoo dataset and apply the proposed pipeline for 2d and 3d landmark annotation extensive experiments on the zoo dataset and the human36m dataset show the reliability and scalability of the proposed annotation pipeline ie given very limited number of camera views and manual annotations the proposed pipeline can reliably annotate the 2d and 3d landmarks while rejecting correcting outliers 1 this paper proposes a feasible semisupervised pipeline for 2d and 3d landmark annotation of articulated objects under inthewild setting in contrast to complex capture systems like panoptic studio that can only be used in the laboratory and applied to limited objects the proposed pipeline is more flexible and has the potential to be applied to a wide variety of objects collecting largescale dataset is possible using the proposed pipeline as only sparse manual annotation is necessary 2 the authors successfully apply the proposed annotation pipeline to the newly collected zoo dataset this shows the effectiveness of the proposed pipeline while providing researchers with valuable data of less studied articulated object categories 3 extensive experiments show the reliability of the proposed annotation pipeline 1 this paper focuses on the semisupervised annotation pipeline rather than the dataset itself however the novelty of the proposed pipeline is limited as it mainly takes advantage of mvnrsfm and optical flow 2 although the proposed pipeline outperforms baseline methods it is not clear whether the annotation quality is sufficient to be used as ground truth more analyses of the pipeline may be necessary eg what should be done when the semisupervised annotation quality is not good enough does adding more manual annotation help if so how many additional annotations would be meaningful from fig 4a increasing manual annotation from 1 to 2 does not seem to significantly improve the outlier rejection 3 there is limited discussion on the zoo dataset it would be nice if the authors can provide more indepth analyses of the dataset eg what is the potential usage of the zoo dataset given it is annotated by the semisupervised pipeline docsephand labeling objects in natural videos is challenging and laborintensive instead of using rigid geometry and calibrated cameras the paper proposed an approach of combining a nonrigid 3d neural prior with the deep optical flow to obtain goodquality landmark detection the approach requires annotations from 12 of the video frames from only 23 uncalibrated handheld cameras the authors conducted a comprehensive analysis in 2d and 3d reconstructions of the remarkable experimental results on two benchmark datasets human and newly curated datasets 1 paper is wellwritten and wellorganized with great clarity on design motivations 2 the arguments made in the paper are justified and backed up by 
experimental results comprehensive analysis is provided to better understand the design motivations and model behaviors 3 the performance improvements in 2d and 3d results are significantly greater than in baselines 4 the idea of active learning is not new but utilizing multiview for bootstrapping in the wild is interesting it could extend to other applications in 3d vision 1 it would be great if the authors can conduct experiments to demonstrate the effectiveness of the method in more unconstrained natural environments when there involve large camera motions as well quantify how large camera motions would impact the results 2 test generalization abilities when context changes eg tigers move from one bridge cage to another dark cage in the zoo and quantify how the method could quickly adapt to continuously changed domains ### Summary:
This paper presents a dataset of uncalibrated pairs/triplets of videos of zoo animals. The paper also presents a method to label this data in a self-supervised manner with minimal annotation. Reviewers appreciated the contents of this dataset and see value for follow-up work; the method for labelling was also appreciated. Weaknesses of the paper have been addressed or acknowledged. I recommend accepting this paper to the NeurIPS 2022 Datasets and Benchmarks program.
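The pipeline described in the review above hinges on propagating a handful of manual 2D annotations through the video with optical flow, after which the multi-view shape prior (MV-NRSFM) rejects or corrects outliers and a 2D detector is retrained. The snippet below sketches only the propagation step. It uses classical Lucas-Kanade flow from OpenCV as a stand-in for the pretrained deep-flow network the method actually relies on, and the synthetic frames, landmark, and tracker parameters are assumptions made purely for illustration.

```python
import numpy as np
import cv2

# Two synthetic grayscale frames: a bright blob that shifts 3 px right and 2 px down.
frame0 = np.zeros((120, 160), np.uint8)
frame1 = np.zeros((120, 160), np.uint8)
cv2.circle(frame0, (60, 50), 8, 255, -1)
cv2.circle(frame1, (63, 52), 8, 255, -1)

# One manually annotated 2D landmark on frame0 (the blob centre).
pts0 = np.array([[[60.0, 50.0]]], dtype=np.float32)

# Propagate the landmark to frame1 with sparse Lucas-Kanade optical flow.
pts1, status, err = cv2.calcOpticalFlowPyrLK(
    frame0, frame1, pts0, None, winSize=(21, 21), maxLevel=3
)
print(status.ravel(), pts1.reshape(-1, 2))  # should land near (63, 52)
```

In the full method these propagated candidates are only hypotheses: candidates that disagree with the fitted multi-view shape prior are flagged for rejection or correction, which is how the pipeline keeps the manual-labelling budget at roughly 1-2% of frames.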
input_ids: [30003, 310, 1677, 2278, …, 2086] (token-ID encoding of the example text above)
attention_mask: [1, 1, 1, …, 1] (all ones; no padding in this row)
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 1814, 88, 403, 4081, 281, 18915, 767, 3237, 337, 440, 1179, 50250, 533, 30492, 10556, 342, 1643, 6849, 285, 374, 247, 3710, 11595, 31825, 253, 2022, 9021, 7027, 275, 495, 7794, 50276, 18, 12661, 247, 41089, 10895, 342, 440, 1179, 50250, 767, 30492, 6849, 374, 12661, 247, 49863, 29974, 13337, 7792, 407, 19732, 2977, 278, 26697, 2967, 22401, 285, 5748, 2685, 3169, 11333, 495, 1263, 1027, 2616, 273, 253, 4081, 1332, 327, 253, 4081, 10895, 24181, 256, 18, 253, 2929, 2794, 247, 41089, 10895, 342, 440, 1179, 50250, 767, 30492, 6849, 253, 2873, 23642, 827, 5893, 15302, 403, 7520, 285, 588, 320, 4217, 323, 2852, 2561, 50276, 84, 19, 253, 2929, 29328, 247, 49863, 29974, 13337, 7792, 407, 19732, 2977, 278, 26697, 2967, 22401, 285, 5748, 2685, 3169, 11333, 278, 26697, 2967, 22401, 22139, 253, 440, 1179, 50250, 1643, 6849, 1223, 5748, 2685, 3169, 3082, 2126, 3710, 31825, 281, 6635, 9183, 253, 2644, 7792, 1408, 10040, 3146, 281, 1347, 1125, 311, 1492, 272, 50276, 84, 20, 253, 4679, 921, 326, 597, 5115, 2074, 3045, 342, 337, 31825, 281, 4751, 22296, 3082, 16280, 253, 4477, 490, 12579, 253, 4081, 1332, 327, 253, 4081, 10895, 24181, 50276, 88, 18, 7681, 38135, 310, 3710, 253, 16108, 273, 10885, 1643, 440, 1179, 50250, 6849, 403, 23923, 432, 278, 26697, 2967, 22401, 285, 253, 374, 69, 39719, 403, 4561, 432, 273, 649, 1041, 48164, 2685, 3169, 1566, 253, 2022, 13817, 9641, 7680, 310, 281, 897, 731, 275, 247, 49863, 29974, 13337, 6974, 50276, 88, 19, 347, 2011, 275, 3036, 577, 66, 253, 3045, 273, 278, 39220, 15323, 347, 253, 1180, 273, 6849, 5459, 47515, 253, 1921, 3212, 326, 50276, 88, 20, 253, 4679, 921, 326, 253, 1332, 2987, 973, 672, 627, 310, 760, 337, 22581, 47515, 253, 5927, 7155, 323, 253, 5933, 281, 789, 973, 50276, 88, 21, 752, 588, 5108, 604, 5748, 2685, 3169, 1566, 17923, 417, 1077, 973, 667, 8284, 323, 253, 8723, 3045, 273, 5748, 2685, 3169, 1566, 5474, 339, 9852, 436, 2929, 253, 4477, 452, 5678, 247, 1327, 10389, 301, 495, 69, 11454, 2720, 342, 3676, 2685, 281, 4044, 1029, 71, 21718, 30951, 8197, 432, 10556, 342, 760, 767, 390, 1264, 440, 1179, 50250, 45848, 14693, 50276, 8906, 3006, 1554, 400, 827, 1327, 10389, 301, 2605, 432, 3200, 278, 26697, 2967, 22401, 281, 625, 27340, 6642, 6568, 24543, 285, 495, 69, 30951, 6887, 432, 27620, 374, 69, 14800, 342, 1643, 14693, 285, 19732, 2977, 3676, 5748, 2685, 253, 2929, 10262, 247, 4460, 1232, 253, 10895, 5728, 323, 436, 2199, 310, 1529, 7680, 281, 253, 3114, 50276, 783, 3745, 281, 513, 5115, 841, 1543, 1293, 18543, 310, 4409, 27321, 18543, 310, 247, 4619, 1232, 285, 275, 1142, 2219, 5667, 533, 407, 23703, 253, 878, 281, 24403, 366, 253, 4081, 2934, 476, 5645, 281, 5235, 273, 4893, 50276, 936, 9232, 253, 495, 69, 49866, 6477, 273, 253, 440, 406, 4686, 5113, 273, 1600, 342, 253, 8723, 14693, 310, 671, 4409, 29570, 50276, 783, 3012, 1698, 1180, 273, 28267, 13009, 3058, 323, 436, 1332, 281, 789, 310, 13943, 50276, 783, 5301, 275, 2829, 337, 310, 671, 4122, 14109, 347, 247, 9414, 50276, 783, 2746, 310, 4518, 10090, 562, 50276, 783, 2929, 310, 973, 3542, 285, 3477, 281, 1239, 50275, 783, 10895, 1146, 4439, 310, 1077, 1355, 352, 28174, 253, 17718, 273, 253, 4081, 2934, 533, 3133, 14999, 20437, 247, 4067, 10895, 342, 30408, 273, 10575, 275, 1635, 281, 253, 5393, 4394, 651, 1361, 2007, 1361, 671, 281, 6780, 253, 4757, 273, 253, 5853, 3486, 273, 253, 5853, 327, 10020, 406, 4686, 5113, 275, 253, 6200, 651, 2007, 1361, 
50276, 249, 30762, 277, 253, 2934, 323, 41113, 3817, 13418, 285, 8493, 253, 1895, 281, 2014, 1789, 20199, 3665, 749, 1895, 310, 1175, 533, 1805, 5609, 476, 320, 908, 323, 253, 1072, 436, 812, 1361, 275, 8493, 253, 1180, 273, 25142, 281, 8415, 436, 50276, 12563, 352, 651, 1361, 281, 2096, 253, 3486, 273, 253, 41113, 12783, 417, 4336, 10985, 253, 19114, 4029, 91, 1796, 275, 253, 6200, 50274, 7152, 33032, 2520, 2929, 29328, 1554, 400, 827, 7491, 10981, 2784, 275, 253, 4956, 278, 39220, 247, 49863, 29974, 13337, 15722, 326, 476, 12182, 366, 374, 69, 285, 495, 69, 39719, 273, 35144, 5113, 1677, 10556, 10848, 407, 767, 390, 1264, 440, 1179, 50250, 45848, 14693, 835, 1249, 273, 13009, 403, 13542, 28267, 5742, 253, 4081, 15722, 21168, 247, 11454, 5281, 2720, 432, 23507, 11595, 31825, 970, 1554, 400, 827, 1327, 10389, 301, 2605, 432, 3200, 278, 26697, 2967, 22401, 840, 8641, 684, 253, 3302, 13301, 949, 253, 2862, 3492, 970, 247, 3215, 11273, 5748, 2685, 1332, 253, 46695, 374, 69, 30951, 9183, 403, 908, 281, 10040, 3146, 851, 1949, 253, 278, 26697, 2967, 22401, 347, 973, 347, 247, 374, 69, 13562, 253, 4477, 4822, 253, 41089, 10895, 285, 4647, 253, 4081, 15722, 323, 374, 69, 285, 495, 69, 30951, 22581, 9470, 4679, 327, 253, 41089, 10895, 285, 253, 1966, 1812, 78, 10895, 921, 253, 13367, 285, 9171, 1430, 273, 253, 4081, 22581, 15722, 26332, 1677, 1077, 3710, 1180, 273, 6568, 6849, 285, 11595, 31825, 253, 4081, 15722, 476, 27340, 12182, 366, 253, 374, 69, 285, 495, 69, 39719, 1223, 33944, 35827, 42559, 337, 436, 2929, 29328, 247, 17887, 49863, 29974, 13337, 15722, 323, 374, 69, 285, 495, 69, 30951, 22581, 273, 35144, 5113, 762, 540, 248, 32778, 4758, 275, 4499, 281, 2570, 9232, 2718, 751, 3199, 37016, 11803, 326, 476, 760, 320, 908, 275, 253, 9965, 285, 3732, 281, 3710, 5113, 253, 4081, 15722, 310, 625, 12112, 285, 556, 253, 2442, 281, 320, 3732, 281, 247, 4618, 5235, 273, 5113, 17055, 1236, 2510, 25912, 10895, 310, 1896, 970, 253, 4081, 15722, 347, 760, 23507, 11595, 22581, 310, 3309, 374, 253, 4477, 8379, 4647, 253, 4081, 22581, 15722, 281, 253, 9841, 5728, 41089, 10895, 436, 2722, 253, 12510, 273, 253, 4081, 15722, 1223, 5277, 8607, 342, 9865, 941, 273, 1679, 5421, 35144, 1789, 9050, 495, 9470, 4679, 921, 253, 13367, 273, 253, 4081, 22581, 15722, 337, 436, 2929, 16633, 327, 253, 49863, 29974, 13337, 22581, 15722, 2581, 685, 253, 10895, 3139, 2299, 253, 38135, 273, 253, 4081, 15722, 310, 3710, 347, 352, 7194, 3936, 5750, 273, 278, 26697, 2967, 22401, 285, 5748, 2685, 50276, 19, 3738, 253, 4081, 15722, 41731, 13015, 8245, 3082, 352, 310, 417, 2590, 1880, 253, 22581, 3290, 310, 4209, 281, 320, 908, 347, 3216, 5083, 625, 6260, 273, 253, 15722, 778, 320, 3309, 24088, 752, 943, 320, 2218, 672, 253, 49863, 29974, 13337, 22581, 3290, 310, 417, 1175, 2217, 1057, 6240, 625, 11595, 22581, 1361, 604, 594, 849, 1142, 3081, 31825, 651, 320, 14282, 432, 3036, 577, 66, 3629, 11595, 22581, 432, 337, 281, 374, 1057, 417, 1646, 281, 3012, 3157, 253, 562, 3623, 18235, 495, 627, 310, 3710, 5955, 327, 253, 41089, 10895, 352, 651, 320, 5322, 604, 253, 4477, 476, 2085, 625, 801, 554, 394, 6260, 273, 253, 10895, 24088, 752, 310, 253, 2442, 10393, 273, 253, 41089, 10895, 1677, 352, 310, 28267, 407, 253, 49863, 29974, 13337, 15722, 50276, 7152, 339, 545, 395, 21473, 5113, 275, 3626, 10556, 310, 11132, 285, 5299, 47986, 3185, 273, 970, 16572, 12087, 285, 35890, 14693, 253, 2929, 4081, 271, 2746, 273, 16248, 247, 1327, 10389, 301, 495, 69, 11454, 2720, 342, 253, 3676, 5748, 2685, 281, 4044, 1175, 15177, 30951, 5481, 253, 2746, 
4419, 31825, 432, 1249, 273, 253, 3492, 13009, 432, 760, 3495, 440, 1179, 50250, 45848, 14693, 253, 4477, 5196, 247, 11088, 1783, 275, 374, 69, 285, 495, 69, 49866, 6477, 273, 253, 13406, 5661, 1543, 327, 767, 22791, 15302, 1966, 285, 9841, 1095, 456, 15302, 337, 2929, 310, 973, 15720, 285, 973, 34092, 342, 1270, 19843, 327, 2216, 42852, 374, 253, 7125, 1160, 275, 253, 2929, 403, 17285, 285, 17245, 598, 407, 5661, 1543, 11088, 1783, 310, 2530, 281, 1805, 2096, 253, 2216, 42852, 285, 1566, 13576, 495, 253, 3045, 11701, 275, 374, 69, 285, 495, 69, 1543, 403, 3012, 3687, 685, 275, 1666, 25379, 577, 253, 2934, 273, 3939, 4715, 310, 417, 747, 533, 17617, 1554, 400, 827, 323, 7491, 10981, 2784, 275, 253, 4956, 310, 4722, 352, 812, 9017, 281, 643, 4893, 275, 495, 69, 8113, 337, 352, 651, 320, 1270, 604, 253, 4477, 476, 2589, 4679, 281, 7568, 253, 12510, 273, 253, 1332, 275, 625, 440, 48454, 3626, 12620, 672, 627, 6388, 1781, 6568, 14462, 347, 973, 22048, 849, 1781, 6568, 14462, 651, 3486, 253, 1543, 374, 1071, 26647, 15277, 672, 3634, 2544, 24088, 246, 304, 398, 2118, 432, 581, 9729, 22133, 281, 1529, 3644, 22133, 275, 253, 41089, 285, 22048, 849, 253, 1332, 812, 4541, 5223, 281, 14949, 4391, 10625, 2490, 187, 4118, 18435, 27, 2520, 2929, 10262, 247, 10895, 273, 440, 1179, 50250, 1349, 712, 363, 28041, 273, 10556, 273, 41089, 5074, 253, 2929, 671, 10262, 247, 1332, 281, 5203, 436, 941, 275, 247, 1881, 35421, 5133, 342, 8723, 22581, 30628, 14109, 253, 9410, 273, 436, 10895, 285, 923, 1318, 323, 956, 598, 789, 253, 1332, 323, 46684, 369, 671, 14109, 32213, 273, 253, 2929, 452, 644, 9713, 390, 14969, 891, 5583, 756, 81, 8211, 436, 2929, 281, 253, 5723, 2824, 1384, 1423, 15302, 285, 49602, 2086 ]
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: there exist metrics for measuring the quality and consistency of standard generative models but these are not applicable to the oneshot setting therefore the authors of this paper introduce and quantify a diversityrecognizability space where single shot generated samples can be evaluated furthermore these two axes make it straightforward to compare the generated samples to samples from a human model a brain extensive experiments establish the solidity of these measures for the omniglot dataset this is a strikingly solid paper which presents a clear and novel idea the result on how ganbased architectures and vaebased approaches place in different parts of the diversityrecognizability scale is highly interesting each experiment is wellmotivated section 3 and their results are thoroughly transparent and well discussed the diversityrecognizability space also presents a very appealing way to think about evaluating generative models well situated in the psychophysics and cognitive science literature to the best of my knowledge relevant architectures and methods are used both for the feature extraction as part of the metrics as well as for the oneshot generation it is difficult to find something to remark on the one thing would be the fact that the method is run on a single dataset but in my view this should be left for future work since a very clear and useful framework is introduced here for anyone to pick up and the experiments are extensive even for this one dataset thus i do not count this as a weakness last the paper is highly readable and wellstructured i believe that it can be of large value to the community yes docsepthis paper studies the problem of fewshot generation on the omniglot dataset with a focus of benchmarking existing algorithms against humans they claim that existing metrics for generative models arent appropriate for evaluating oneshot generalization and propose a different framework for this purpose their framework is based on two aspects diversity and recognizability in both cases measuring these aspects is dependent upon a trained model in the case of diversity a feature extractor is used to measure intraclass variability standard deviation of produced samples in the feature space around a prototype and a simclr encoder is used for this for recognizability they make use of a oneshot classification model and the accuracy of that model on generated samples is the recognizability measure they use a prototypical network for this they investigate where vaebased and ganbased models fall in this evaluation framework and how this is affected by various hyperparameters due to the setup of the omniglot dataset it is also possible to compare against humans omniglot contains humandrawn characters which can be thought of as being the result of a human generative process they found that the neural statisticianbased vae is the closest on average to humans in the diversityrecognizability space strengths the paper is clearly written for the most part see a couple exceptions in the detailed comments below drawing connections between humans and different generative models is interesting the proposed framework is wellgrounded in that it draws inspiration from psychophysics work weaknesses limited scope poorly motivated limited significance in the current form detailed comments clarity in figure 4 it would be clearer to actually show the size of the context set eg via the size of
the circle or color or something else currently i assume the direction of the arrow is giving a proxy for this but this is not as informative in figure 5 in the caption bigger squares correspond to the model showing the smallest distance to humans this is unclear as it sounds like the bigger the square the smaller the distance but these squares for the different models all have the same size and instead its their yaxis position that captures the distance should rephrase to the squares correspond to ie remove the word bigger if my understanding is correct as a highlevel comment it would have been useful to give some more details about the models used for example it is hard to understand exactly what effect we expect the context size to have on vaens without a good enough understanding of that model motivation and significance to motivate their evaluation framework the authors claim that existing metrics dont work well if the training and testing samples are too dissimilar in omniglot though this isnt really the case there have been several studies that show that in fact feature reuse is sufficient for good fewshot learning performance on simple datasets like omniglot see 1 for an example in the context of maml 2 for a more general example and in fact it was also demonstrated that good performance can even be achieved on new omniglot classes without using the support labels at test time 3 which also hints at the simplicity of this generalization challenge so is this claim just a hypothesis or is it based on some evidence on the failure of existing metrics for oneshot generation evaluation on omniglot especially in the weak generalization setting considered here this might not be an issue further for measuring recognizability it wasnt clear to me why we need to use a oneshot model as the critic for whether the generated samples can be accurately classified why cant we use an oracle here that was trained on more shots from the new classes one might argue that this is okay since this model is used just for evaluation purposes though perhaps we do want an examplelevel split where eg this oracle is trained on support images but not query images from the new classes the current setup seemed unnecessarily restrictive to me and not well motivated if we can use an oracle trained on more shots this also sidesteps the other difficulty mentioned by the authors as a limitation that prevents us from using existing metrics evaluation framework for this problem finally why did the authors decide to use a different model simclr to get the encoder used for measuring diversity compared to the model used for recognizability prototypical networks cant we use the same model for both of these and ideally report results with different choices for this model limited scope despite this paper being largely empirical there is a limited number of models experimented with why not also compare against the fewshot generalization model in 4 for example omniglot is also a very simple dataset and fewshot learning research at least for classification has drifted from it these days 5 it would be nice to report results on more modern datasets quickdraw which is mentioned in the discussions can be a good option as it still allows comparisons to humans significance all in all due to the limited scope and insufficient motivation of the proposed framework and set of experiments i felt that the significance of the findings is low further some findings like gans lacking diversity are already known eg mode collapse is a known problem 
for gans verifying that the same is true in the oneshot scenario is interesting but not too surprising especially in this weak generalization case studied in this paper references 1 rapid learning or feature reuse towards understanding the effectiveness of maml raghu et al iclr 2020 2 a closer look at fewshot classification chen et al iclr 2020 3 are fewshot learning benchmarks too simple solving them without task supervision at testtime huang et al 4 oneshot generalization in deep generative models rezende et al icml 2016 5 metadataset a dataset of datasets for learning to learn from few examples triantafillou et al iclr 2020 aside from the limitations mentioned earlier in the review another one is the following what if instead of a single image a few images were available of the new visual concept fewshot learning instead of oneshot learning the proposed framework assumes that in that case they will have to be aggregated into a prototype but this is not necessarily the best approach it would be nice to make it more flexible to allow evaluating any method for exploiting more than one shots docsepin this paper the authors propose a new framework for assessing the quality of oneshot generative models for images particularly the authors frame it as a twoobject metric where diversity how generated examples differ from each other and accuracy whether the generated example resembles the given single example of the generative model are measured furthermore the humanmade examples can also be put into comparison the authors conduct the experiments of applying such twoobject metric to representative oneshot generative models with detailed analysis and discussion strengths of this work it addresses the relatively less studied field of oneshot image generation and presents a framework as a metric for assessing the quality the idea of using two objects is novel this work is more of an empirical work and it does a good job in clearly presenting experiments and the analysis comparison of representative oneshot image generation methods with human performance comparable are shown visually and the discussion is substantial overall the writing quality is high and the content is substantial the result presented in this paper would provide interesting insights in the oneshot image generation weaknesses of this work if they are properly addressed im happy to increase my rating lack of comparison with previously leveraged metrics its true that the proposed method is more holistic than the scattered ones used in previous works for evaluation such as tsne clustering however readers would be also interested in how the new framework compares with such measures for example how diversity andor accuracy correlates with clustering results the dataset is limited to omniglot other datasets used in oneshot image generation for example cifar should also be considered as they may show different behaviors which is the focus of what a metric would be studying in such a case human output is not feasible and thus not in the comparison but still the impact of different methods especially with different hyper parameters where the paper paid a lot of attention would have significance some design choices are not clear see questions below changes duringafter rebuttal increased score from 6 to 7 no issue ### Summary:
this paper studies the problem of fewshot generation on the omniglot dataset contrasting fewshot generative models against humans they introduce diversity and recognizability metrics and perform an empirical analysis of how various models are situated in the diversityrecognizability plane relative to humans overall this is a wellwritten paper with a clear story and experiments it provides interesting insights on how various generative models architectures relate to humans in a particular task
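A minimal sketch of the evaluation described in the reviews above, assuming a frozen SimCLR-style encoder and a trained one-shot classifier such as a Prototypical Network are available as callables (the function names and signatures here are illustrative stand-ins, not taken from the paper):

import numpy as np

def diversity(generated_images, feature_extractor):
    # intra-class variability: root-mean-square distance of the generated samples'
    # embeddings to their prototype (mean embedding), one plausible reading of
    # "standard deviation around a prototype" in feature space; higher = more diverse
    feats = np.stack([feature_extractor(x) for x in generated_images])  # shape (n, d)
    prototype = feats.mean(axis=0)
    return float(np.sqrt(((feats - prototype) ** 2).sum(axis=1).mean()))

def recognizability(generated_images, context_image, target_label, one_shot_classifier):
    # accuracy of a one-shot classifier conditioned on the single context exemplar:
    # the fraction of generated samples assigned to the intended class
    preds = [one_shot_classifier(x, support_image=context_image) for x in generated_images]
    return float(np.mean([p == target_label for p in preds]))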
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 9088, 2226, 17082, 323, 10499, 253, 3290, 285, 15274, 273, 2629, 1006, 800, 3210, 533, 841, 403, 417, 7763, 281, 253, 4394, 12022, 4758, 3103, 253, 4477, 273, 436, 2929, 9569, 285, 22048, 247, 9991, 21888, 50228, 2317, 835, 2014, 5103, 4561, 3530, 476, 320, 6760, 33810, 841, 767, 24039, 1056, 352, 15246, 281, 7277, 253, 4561, 3530, 281, 3530, 432, 247, 1966, 1566, 247, 3998, 9470, 4679, 5100, 253, 4891, 414, 273, 841, 5593, 323, 253, 33039, 304, 11753, 10895, 436, 310, 247, 13631, 314, 4891, 2929, 534, 10262, 247, 2590, 285, 4460, 2934, 253, 906, 327, 849, 36827, 3169, 35615, 285, 13460, 2275, 833, 7274, 1659, 275, 1027, 4243, 273, 253, 9991, 21888, 50228, 4311, 310, 4122, 4722, 1016, 3368, 310, 973, 24013, 8550, 2593, 495, 285, 616, 1543, 403, 16575, 13955, 285, 973, 5469, 253, 9991, 21888, 50228, 2317, 671, 10262, 247, 1077, 23176, 1039, 281, 1158, 670, 16344, 1006, 800, 3210, 973, 17860, 275, 253, 4369, 16946, 982, 285, 9699, 5859, 6239, 281, 253, 1682, 273, 619, 3640, 4623, 35615, 285, 3082, 403, 908, 1097, 323, 253, 4735, 11998, 347, 629, 273, 253, 17082, 347, 973, 347, 323, 253, 4394, 12022, 5978, 352, 310, 2834, 281, 1089, 1633, 281, 7579, 327, 253, 581, 2181, 651, 320, 253, 958, 326, 253, 1332, 310, 1408, 327, 247, 2014, 10895, 533, 275, 619, 1859, 436, 943, 320, 1669, 323, 2852, 789, 1580, 247, 1077, 2590, 285, 4217, 7792, 310, 5611, 1060, 323, 3780, 281, 2619, 598, 50276, 395, 253, 4679, 403, 9470, 1014, 323, 436, 581, 10895, 3021, 891, 513, 417, 1385, 436, 347, 247, 14855, 1390, 253, 2929, 310, 4122, 34025, 285, 973, 34218, 891, 2868, 326, 352, 476, 320, 273, 1781, 1318, 281, 253, 3114, 4754, 5474, 33032, 2520, 2929, 2175, 253, 1895, 273, 1643, 11860, 5978, 327, 253, 33039, 304, 11753, 10895, 342, 247, 2770, 273, 22791, 272, 5368, 11333, 1411, 7497, 597, 1750, 326, 5368, 17082, 323, 1006, 800, 3210, 403, 2649, 4569, 323, 16344, 4394, 12022, 26647, 285, 12661, 247, 1027, 7792, 323, 436, 4096, 616, 7792, 310, 1754, 327, 767, 7794, 9991, 285, 3183, 50228, 275, 1097, 2219, 10499, 841, 7794, 310, 7976, 2220, 247, 10166, 1566, 275, 253, 1083, 273, 9991, 247, 4735, 4908, 263, 310, 908, 281, 2557, 11251, 14407, 13099, 2629, 11254, 273, 4197, 3530, 275, 253, 4735, 2317, 1475, 247, 21841, 285, 247, 948, 498, 83, 32049, 310, 908, 323, 436, 323, 3183, 50228, 597, 1056, 897, 273, 247, 4394, 12022, 9162, 1566, 285, 253, 7200, 273, 326, 1566, 327, 4561, 3530, 310, 253, 3183, 50228, 2557, 597, 897, 247, 3861, 49225, 2990, 323, 436, 597, 7409, 835, 13460, 2275, 833, 285, 36827, 3169, 3210, 2965, 275, 436, 7103, 7792, 285, 849, 436, 310, 5876, 407, 2710, 4373, 22041, 1955, 281, 253, 9978, 273, 253, 33039, 304, 11753, 10895, 352, 310, 671, 1896, 281, 7277, 1411, 7497, 33039, 304, 11753, 4428, 1547, 17244, 939, 5810, 534, 476, 320, 1869, 273, 347, 1146, 253, 906, 273, 247, 1966, 1006, 800, 1232, 597, 1119, 326, 253, 11454, 26312, 757, 3169, 362, 3348, 310, 253, 8642, 327, 3388, 281, 7497, 275, 253, 9991, 21888, 50228, 2317, 50275, 296, 3755, 20556, 50276, 783, 2929, 310, 4518, 3542, 323, 253, 954, 629, 923, 247, 4564, 16022, 275, 253, 7000, 5701, 2708, 50275, 6553, 272, 10291, 875, 7497, 285, 1027, 1006, 800, 3210, 310, 4722, 50274, 783, 4081, 7792, 310, 973, 2595, 264, 275, 326, 352, 21354, 17006, 432, 4369, 16946, 982, 789, 50275, 20881, 1255, 265, 50276, 15870, 7990, 50275, 31943, 314, 17194, 50275, 15870, 8453, 275, 253, 1655, 830, 50274, 5992, 7193, 
5701, 50275, 498, 15752, 50276, 249, 4677, 577, 352, 651, 320, 30909, 281, 2686, 921, 253, 1979, 273, 253, 3634, 873, 24088, 3066, 253, 1979, 273, 253, 9096, 390, 3295, 390, 1633, 2010, 4390, 891, 5467, 253, 3884, 273, 253, 14150, 310, 4933, 247, 17335, 323, 436, 533, 436, 310, 417, 347, 27096, 50275, 249, 4677, 608, 275, 253, 11743, 8750, 19325, 2723, 281, 253, 1566, 4645, 253, 8004, 4181, 281, 7497, 436, 310, 12744, 347, 352, 7835, 751, 253, 8750, 253, 6278, 253, 4577, 253, 4181, 533, 841, 19325, 323, 253, 1027, 3210, 512, 452, 253, 1072, 1979, 285, 3185, 697, 616, 340, 10565, 1899, 326, 28174, 253, 4181, 943, 294, 40712, 281, 253, 19325, 2723, 281, 26332, 5386, 253, 3159, 8750, 604, 619, 4685, 310, 3451, 50276, 284, 247, 1029, 5251, 4385, 352, 651, 452, 644, 4217, 281, 1918, 690, 625, 4278, 670, 253, 3210, 908, 323, 1650, 352, 310, 1892, 281, 2096, 4555, 752, 1055, 359, 1902, 253, 3634, 1979, 281, 452, 327, 13460, 561, 1293, 247, 1175, 2217, 4685, 273, 326, 1566, 50275, 24013, 7639, 285, 8453, 50276, 936, 41509, 616, 7103, 7792, 253, 4477, 1750, 326, 5368, 17082, 13414, 789, 973, 604, 253, 3733, 285, 5175, 3530, 403, 1512, 43110, 275, 33039, 304, 11753, 2167, 436, 310, 2649, 1663, 253, 1083, 627, 452, 644, 2067, 2175, 326, 921, 326, 275, 958, 4735, 33150, 310, 4209, 323, 1175, 1643, 11860, 4715, 3045, 327, 2969, 15302, 751, 33039, 304, 11753, 923, 337, 323, 271, 1650, 275, 253, 3634, 273, 278, 16878, 374, 323, 247, 625, 2087, 1650, 285, 275, 958, 352, 369, 671, 5183, 326, 1175, 3045, 476, 1014, 320, 6786, 327, 747, 33039, 304, 11753, 5971, 1293, 970, 253, 1329, 13301, 387, 1071, 673, 495, 534, 671, 28145, 387, 253, 17647, 273, 436, 26647, 5691, 594, 310, 436, 1750, 816, 247, 9079, 390, 310, 352, 1754, 327, 690, 1941, 327, 253, 4433, 273, 5368, 17082, 323, 4394, 12022, 5978, 7103, 327, 33039, 304, 11753, 3340, 275, 253, 5075, 26647, 4758, 2783, 1060, 436, 1537, 417, 320, 271, 2523, 50276, 44295, 323, 10499, 3183, 50228, 352, 369, 2649, 2590, 281, 479, 2139, 359, 878, 281, 897, 247, 4394, 12022, 1566, 347, 253, 7291, 323, 1880, 253, 4561, 3530, 476, 320, 13613, 10509, 2139, 16216, 359, 897, 271, 42295, 1060, 326, 369, 10166, 327, 625, 13768, 432, 253, 747, 5971, 581, 1537, 9059, 326, 436, 310, 8261, 1580, 436, 1566, 310, 908, 816, 323, 7103, 6378, 2167, 4931, 359, 513, 971, 271, 1650, 5251, 8085, 835, 24088, 436, 42295, 310, 10166, 327, 1329, 3888, 533, 417, 7316, 3888, 432, 253, 747, 5971, 253, 1655, 9978, 4455, 48312, 29190, 281, 479, 285, 417, 973, 17194, 604, 359, 476, 897, 271, 42295, 10166, 327, 625, 13768, 436, 671, 25549, 383, 2265, 253, 643, 10183, 5393, 407, 253, 4477, 347, 247, 12291, 326, 16897, 441, 432, 970, 5368, 17082, 50276, 15419, 2368, 7792, 323, 436, 1895, 50276, 71, 3341, 2139, 858, 253, 4477, 7617, 281, 897, 247, 1027, 1566, 948, 498, 83, 281, 755, 253, 32049, 908, 323, 10499, 9991, 2429, 281, 253, 1566, 908, 323, 3183, 50228, 3861, 49225, 6928, 16216, 359, 897, 253, 1072, 1566, 323, 1097, 273, 841, 285, 34243, 1304, 1543, 342, 1027, 10165, 323, 436, 1566, 50276, 15870, 7990, 50276, 3229, 3784, 436, 2929, 1146, 8127, 16774, 627, 310, 247, 3710, 1180, 273, 3210, 3368, 264, 342, 2139, 417, 671, 7277, 1411, 253, 1643, 11860, 26647, 1566, 275, 577, 323, 1650, 50276, 297, 79, 304, 11753, 310, 671, 247, 1077, 2969, 10895, 285, 1643, 11860, 4715, 2561, 387, 1878, 323, 9162, 556, 39147, 432, 352, 841, 1897, 608, 352, 651, 320, 5322, 281, 1304, 1543, 327, 625, 4980, 15302, 3158, 6553, 534, 310, 5393, 275, 253, 11985, 476, 320, 247, 1175, 4500, 347, 352, 1335, 4483, 14023, 
281, 7497, 50276, 9188, 40348, 50275, 455, 275, 512, 1955, 281, 253, 3710, 7990, 285, 12497, 16038, 273, 253, 4081, 7792, 285, 873, 273, 4679, 891, 3543, 326, 253, 8453, 273, 253, 4342, 310, 1698, 2007, 690, 4342, 751, 305, 507, 14999, 9991, 403, 2168, 1929, 24088, 4438, 13551, 310, 247, 1929, 1895, 323, 305, 507, 49160, 326, 253, 1072, 310, 2032, 275, 253, 4394, 12022, 10076, 310, 4722, 533, 417, 1512, 10084, 3340, 275, 436, 5075, 26647, 1083, 5421, 275, 436, 2929, 50276, 250, 3065, 50276, 18, 5233, 4715, 390, 4735, 33150, 4404, 4685, 253, 12510, 273, 278, 16878, 23603, 11917, 1162, 355, 17857, 32888, 9169, 374, 247, 8003, 1007, 387, 1643, 11860, 9162, 260, 864, 1162, 355, 17857, 32888, 9169, 495, 403, 1643, 11860, 4715, 49602, 1512, 2969, 50276, 84, 11932, 731, 1293, 4836, 20446, 387, 1071, 2606, 30287, 606, 1162, 355, 577, 4394, 12022, 26647, 275, 3676, 1006, 800, 3210, 294, 91, 9747, 1162, 355, 17857, 1686, 4022, 608, 1313, 324, 255, 23456, 247, 10895, 273, 15302, 323, 4715, 281, 3037, 432, 1643, 6667, 1195, 386, 2320, 408, 276, 1162, 355, 17857, 32888, 9169, 50276, 45529, 432, 253, 7364, 5393, 4321, 275, 253, 2278, 1529, 581, 310, 253, 1563, 752, 604, 3185, 273, 247, 2014, 2460, 247, 1643, 3888, 497, 2130, 273, 253, 747, 5304, 4473, 1643, 11860, 4715, 3185, 273, 4394, 12022, 4715, 253, 4081, 7792, 19584, 326, 275, 326, 1083, 597, 588, 452, 281, 320, 40006, 715, 247, 21841, 533, 436, 310, 417, 7933, 253, 1682, 2746, 352, 651, 320, 5322, 281, 1056, 352, 625, 12112, 281, 1581, 16344, 667, 1332, 323, 38883, 625, 685, 581, 13768, 50276, 7152, 339, 9852, 436, 2929, 253, 4477, 12661, 247, 747, 7792, 323, 18005, 253, 3290, 273, 4394, 12022, 1006, 800, 3210, 323, 3888, 3782, 253, 4477, 3665, 352, 347, 247, 767, 6082, 7982, 835, 9991, 849, 4561, 6667, 9184, 432, 1016, 643, 285, 7200, 1880, 253, 4561, 1650, 29217, 253, 1677, 2014, 1650, 273, 253, 1006, 800, 1566, 403, 4080, 33810, 253, 1966, 12710, 6667, 476, 671, 320, 1691, 715, 5301, 253, 4477, 2589, 253, 4679, 273, 9433, 824, 767, 6082, 7982, 281, 8612, 4394, 12022, 1006, 800, 3210, 342, 7000, 1783, 285, 5955, 20544, 273, 436, 789, 50276, 262, 12453, 253, 4942, 1679, 5421, 1673, 273, 4394, 12022, 2460, 5978, 285, 10262, 247, 7792, 347, 247, 7982, 323, 18005, 253, 3290, 253, 2934, 273, 970, 767, 5113, 310, 4460, 50276, 2520, 789, 310, 625, 273, 271, 16774, 789, 285, 352, 1057, 247, 1175, 2628, 275, 22980, 15250, 4679, 285, 253, 1783, 5301, 273, 8612, 4394, 12022, 2460, 5978, 3082, 342, 1966, 3045, 10870, 403, 2011, 25910, 285, 253, 5955, 310, 6832, 4583, 253, 4028, 3290, 310, 1029, 285, 253, 2600, 310, 6832, 50276, 783, 906, 3559, 275, 436, 2929, 651, 2085, 4722, 16039, 275, 253, 4394, 12022, 2460, 5978, 50275, 20881, 1255, 265, 273, 436, 789, 604, 597, 403, 6283, 9713, 516, 5211, 281, 2572, 619, 13716, 50276, 77, 471, 273, 5301, 342, 3786, 19732, 2961, 17082, 697, 2032, 326, 253, 4081, 1332, 310, 625, 45290, 685, 253, 17485, 4394, 908, 275, 2045, 2987, 323, 7103, 824, 347, 28669, 570, 17524, 2299, 10668, 651, 320, 671, 6110, 275, 849, 253, 747, 7792, 17066, 342, 824, 5593, 323, 1650, 849, 9991, 285, 263, 7200, 3007, 27619, 342, 17524, 1543, 50276, 783, 10895, 310, 3710, 281, 33039, 304, 11753, 643, 15302, 908, 275, 4394, 12022, 2460, 5978, 323, 1650, 260, 338, 274, 943, 671, 320, 50276, 46779, 347, 597, 778, 921, 1027, 13576, 534, 310, 253, 2770, 273, 752, 247, 7982, 651, 320, 12392, 275, 824, 247, 1083, 1966, 3453, 310, 417, 17887, 285, 3021, 417, 275, 253, 5301, 533, 1335, 253, 3486, 273, 1027, 3082, 3340, 342, 1027, 4373, 3602, 835, 253, 
2929, 5087, 247, 2257, 273, 4116, 651, 452, 8453, 50276, 8826, 2216, 10165, 403, 417, 2590, 923, 3533, 2708, 50274, 31973, 1309, 6438, 30080, 22559, 50276, 19687, 833, 4868, 432, 721, 281, 818, 50275, 2369, 2523, 2490, 187, 4118, 18435, 27, 2520, 2929, 2175, 253, 1895, 273, 1643, 11860, 5978, 327, 253, 33039, 304, 11753, 10895, 42455, 1643, 11860, 1006, 800, 3210, 1411, 7497, 597, 9569, 9991, 285, 3183, 50228, 17082, 285, 1347, 271, 16774, 1783, 273, 849, 2710, 3210, 403, 17860, 275, 253, 9991, 21888, 50228, 6415, 4103, 281, 7497, 50276, 1189, 455, 436, 310, 247, 973, 15720, 2929, 342, 247, 2590, 2926, 285, 4679, 352, 3400, 4722, 16039, 327, 849, 2710, 1006, 800, 3210, 35615, 14588, 281, 7497, 275, 247, 1798, 4836 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 9088, 2226, 17082, 323, 10499, 253, 3290, 285, 15274, 273, 2629, 1006, 800, 3210, 533, 841, 403, 417, 7763, 281, 253, 4394, 12022, 4758, 3103, 253, 4477, 273, 436, 2929, 9569, 285, 22048, 247, 9991, 21888, 50228, 2317, 835, 2014, 5103, 4561, 3530, 476, 320, 6760, 33810, 841, 767, 24039, 1056, 352, 15246, 281, 7277, 253, 4561, 3530, 281, 3530, 432, 247, 1966, 1566, 247, 3998, 9470, 4679, 5100, 253, 4891, 414, 273, 841, 5593, 323, 253, 33039, 304, 11753, 10895, 436, 310, 247, 13631, 314, 4891, 2929, 534, 10262, 247, 2590, 285, 4460, 2934, 253, 906, 327, 849, 36827, 3169, 35615, 285, 13460, 2275, 833, 7274, 1659, 275, 1027, 4243, 273, 253, 9991, 21888, 50228, 4311, 310, 4122, 4722, 1016, 3368, 310, 973, 24013, 8550, 2593, 495, 285, 616, 1543, 403, 16575, 13955, 285, 973, 5469, 253, 9991, 21888, 50228, 2317, 671, 10262, 247, 1077, 23176, 1039, 281, 1158, 670, 16344, 1006, 800, 3210, 973, 17860, 275, 253, 4369, 16946, 982, 285, 9699, 5859, 6239, 281, 253, 1682, 273, 619, 3640, 4623, 35615, 285, 3082, 403, 908, 1097, 323, 253, 4735, 11998, 347, 629, 273, 253, 17082, 347, 973, 347, 323, 253, 4394, 12022, 5978, 352, 310, 2834, 281, 1089, 1633, 281, 7579, 327, 253, 581, 2181, 651, 320, 253, 958, 326, 253, 1332, 310, 1408, 327, 247, 2014, 10895, 533, 275, 619, 1859, 436, 943, 320, 1669, 323, 2852, 789, 1580, 247, 1077, 2590, 285, 4217, 7792, 310, 5611, 1060, 323, 3780, 281, 2619, 598, 50276, 395, 253, 4679, 403, 9470, 1014, 323, 436, 581, 10895, 3021, 891, 513, 417, 1385, 436, 347, 247, 14855, 1390, 253, 2929, 310, 4122, 34025, 285, 973, 34218, 891, 2868, 326, 352, 476, 320, 273, 1781, 1318, 281, 253, 3114, 4754, 5474, 33032, 2520, 2929, 2175, 253, 1895, 273, 1643, 11860, 5978, 327, 253, 33039, 304, 11753, 10895, 342, 247, 2770, 273, 22791, 272, 5368, 11333, 1411, 7497, 597, 1750, 326, 5368, 17082, 323, 1006, 800, 3210, 403, 2649, 4569, 323, 16344, 4394, 12022, 26647, 285, 12661, 247, 1027, 7792, 323, 436, 4096, 616, 7792, 310, 1754, 327, 767, 7794, 9991, 285, 3183, 50228, 275, 1097, 2219, 10499, 841, 7794, 310, 7976, 2220, 247, 10166, 1566, 275, 253, 1083, 273, 9991, 247, 4735, 4908, 263, 310, 908, 281, 2557, 11251, 14407, 13099, 2629, 11254, 273, 4197, 3530, 275, 253, 4735, 2317, 1475, 247, 21841, 285, 247, 948, 498, 83, 32049, 310, 908, 323, 436, 323, 3183, 50228, 597, 1056, 897, 273, 247, 4394, 12022, 9162, 1566, 285, 253, 7200, 273, 326, 1566, 327, 4561, 3530, 310, 253, 3183, 50228, 2557, 597, 897, 247, 3861, 49225, 2990, 323, 436, 597, 7409, 835, 13460, 2275, 833, 285, 36827, 3169, 3210, 2965, 275, 436, 7103, 7792, 285, 849, 436, 310, 5876, 407, 2710, 4373, 22041, 1955, 281, 253, 9978, 273, 253, 33039, 304, 11753, 10895, 352, 310, 671, 1896, 281, 7277, 1411, 7497, 33039, 304, 11753, 4428, 1547, 17244, 939, 5810, 534, 476, 320, 1869, 273, 347, 1146, 253, 906, 273, 247, 1966, 1006, 800, 1232, 597, 1119, 326, 253, 11454, 26312, 757, 3169, 362, 3348, 310, 253, 8642, 327, 3388, 281, 7497, 275, 253, 9991, 21888, 50228, 2317, 50275, 296, 3755, 20556, 50276, 783, 2929, 310, 4518, 3542, 323, 253, 954, 629, 923, 247, 4564, 16022, 275, 253, 7000, 5701, 2708, 50275, 6553, 272, 10291, 875, 7497, 285, 1027, 1006, 800, 3210, 310, 4722, 50274, 783, 4081, 7792, 310, 973, 2595, 264, 275, 326, 352, 21354, 17006, 432, 4369, 16946, 982, 789, 50275, 20881, 1255, 265, 50276, 15870, 7990, 50275, 31943, 314, 17194, 50275, 15870, 8453, 275, 253, 1655, 830, 50274, 5992, 7193, 
5701, 50275, 498, 15752, 50276, 249, 4677, 577, 352, 651, 320, 30909, 281, 2686, 921, 253, 1979, 273, 253, 3634, 873, 24088, 3066, 253, 1979, 273, 253, 9096, 390, 3295, 390, 1633, 2010, 4390, 891, 5467, 253, 3884, 273, 253, 14150, 310, 4933, 247, 17335, 323, 436, 533, 436, 310, 417, 347, 27096, 50275, 249, 4677, 608, 275, 253, 11743, 8750, 19325, 2723, 281, 253, 1566, 4645, 253, 8004, 4181, 281, 7497, 436, 310, 12744, 347, 352, 7835, 751, 253, 8750, 253, 6278, 253, 4577, 253, 4181, 533, 841, 19325, 323, 253, 1027, 3210, 512, 452, 253, 1072, 1979, 285, 3185, 697, 616, 340, 10565, 1899, 326, 28174, 253, 4181, 943, 294, 40712, 281, 253, 19325, 2723, 281, 26332, 5386, 253, 3159, 8750, 604, 619, 4685, 310, 3451, 50276, 284, 247, 1029, 5251, 4385, 352, 651, 452, 644, 4217, 281, 1918, 690, 625, 4278, 670, 253, 3210, 908, 323, 1650, 352, 310, 1892, 281, 2096, 4555, 752, 1055, 359, 1902, 253, 3634, 1979, 281, 452, 327, 13460, 561, 1293, 247, 1175, 2217, 4685, 273, 326, 1566, 50275, 24013, 7639, 285, 8453, 50276, 936, 41509, 616, 7103, 7792, 253, 4477, 1750, 326, 5368, 17082, 13414, 789, 973, 604, 253, 3733, 285, 5175, 3530, 403, 1512, 43110, 275, 33039, 304, 11753, 2167, 436, 310, 2649, 1663, 253, 1083, 627, 452, 644, 2067, 2175, 326, 921, 326, 275, 958, 4735, 33150, 310, 4209, 323, 1175, 1643, 11860, 4715, 3045, 327, 2969, 15302, 751, 33039, 304, 11753, 923, 337, 323, 271, 1650, 275, 253, 3634, 273, 278, 16878, 374, 323, 247, 625, 2087, 1650, 285, 275, 958, 352, 369, 671, 5183, 326, 1175, 3045, 476, 1014, 320, 6786, 327, 747, 33039, 304, 11753, 5971, 1293, 970, 253, 1329, 13301, 387, 1071, 673, 495, 534, 671, 28145, 387, 253, 17647, 273, 436, 26647, 5691, 594, 310, 436, 1750, 816, 247, 9079, 390, 310, 352, 1754, 327, 690, 1941, 327, 253, 4433, 273, 5368, 17082, 323, 4394, 12022, 5978, 7103, 327, 33039, 304, 11753, 3340, 275, 253, 5075, 26647, 4758, 2783, 1060, 436, 1537, 417, 320, 271, 2523, 50276, 44295, 323, 10499, 3183, 50228, 352, 369, 2649, 2590, 281, 479, 2139, 359, 878, 281, 897, 247, 4394, 12022, 1566, 347, 253, 7291, 323, 1880, 253, 4561, 3530, 476, 320, 13613, 10509, 2139, 16216, 359, 897, 271, 42295, 1060, 326, 369, 10166, 327, 625, 13768, 432, 253, 747, 5971, 581, 1537, 9059, 326, 436, 310, 8261, 1580, 436, 1566, 310, 908, 816, 323, 7103, 6378, 2167, 4931, 359, 513, 971, 271, 1650, 5251, 8085, 835, 24088, 436, 42295, 310, 10166, 327, 1329, 3888, 533, 417, 7316, 3888, 432, 253, 747, 5971, 253, 1655, 9978, 4455, 48312, 29190, 281, 479, 285, 417, 973, 17194, 604, 359, 476, 897, 271, 42295, 10166, 327, 625, 13768, 436, 671, 25549, 383, 2265, 253, 643, 10183, 5393, 407, 253, 4477, 347, 247, 12291, 326, 16897, 441, 432, 970, 5368, 17082, 50276, 15419, 2368, 7792, 323, 436, 1895, 50276, 71, 3341, 2139, 858, 253, 4477, 7617, 281, 897, 247, 1027, 1566, 948, 498, 83, 281, 755, 253, 32049, 908, 323, 10499, 9991, 2429, 281, 253, 1566, 908, 323, 3183, 50228, 3861, 49225, 6928, 16216, 359, 897, 253, 1072, 1566, 323, 1097, 273, 841, 285, 34243, 1304, 1543, 342, 1027, 10165, 323, 436, 1566, 50276, 15870, 7990, 50276, 3229, 3784, 436, 2929, 1146, 8127, 16774, 627, 310, 247, 3710, 1180, 273, 3210, 3368, 264, 342, 2139, 417, 671, 7277, 1411, 253, 1643, 11860, 26647, 1566, 275, 577, 323, 1650, 50276, 297, 79, 304, 11753, 310, 671, 247, 1077, 2969, 10895, 285, 1643, 11860, 4715, 2561, 387, 1878, 323, 9162, 556, 39147, 432, 352, 841, 1897, 608, 352, 651, 320, 5322, 281, 1304, 1543, 327, 625, 4980, 15302, 3158, 6553, 534, 310, 5393, 275, 253, 11985, 476, 320, 247, 1175, 4500, 347, 352, 1335, 4483, 14023, 
281, 7497, 50276, 9188, 40348, 50275, 455, 275, 512, 1955, 281, 253, 3710, 7990, 285, 12497, 16038, 273, 253, 4081, 7792, 285, 873, 273, 4679, 891, 3543, 326, 253, 8453, 273, 253, 4342, 310, 1698, 2007, 690, 4342, 751, 305, 507, 14999, 9991, 403, 2168, 1929, 24088, 4438, 13551, 310, 247, 1929, 1895, 323, 305, 507, 49160, 326, 253, 1072, 310, 2032, 275, 253, 4394, 12022, 10076, 310, 4722, 533, 417, 1512, 10084, 3340, 275, 436, 5075, 26647, 1083, 5421, 275, 436, 2929, 50276, 250, 3065, 50276, 18, 5233, 4715, 390, 4735, 33150, 4404, 4685, 253, 12510, 273, 278, 16878, 23603, 11917, 1162, 355, 17857, 32888, 9169, 374, 247, 8003, 1007, 387, 1643, 11860, 9162, 260, 864, 1162, 355, 17857, 32888, 9169, 495, 403, 1643, 11860, 4715, 49602, 1512, 2969, 50276, 84, 11932, 731, 1293, 4836, 20446, 387, 1071, 2606, 30287, 606, 1162, 355, 577, 4394, 12022, 26647, 275, 3676, 1006, 800, 3210, 294, 91, 9747, 1162, 355, 17857, 1686, 4022, 608, 1313, 324, 255, 23456, 247, 10895, 273, 15302, 323, 4715, 281, 3037, 432, 1643, 6667, 1195, 386, 2320, 408, 276, 1162, 355, 17857, 32888, 9169, 50276, 45529, 432, 253, 7364, 5393, 4321, 275, 253, 2278, 1529, 581, 310, 253, 1563, 752, 604, 3185, 273, 247, 2014, 2460, 247, 1643, 3888, 497, 2130, 273, 253, 747, 5304, 4473, 1643, 11860, 4715, 3185, 273, 4394, 12022, 4715, 253, 4081, 7792, 19584, 326, 275, 326, 1083, 597, 588, 452, 281, 320, 40006, 715, 247, 21841, 533, 436, 310, 417, 7933, 253, 1682, 2746, 352, 651, 320, 5322, 281, 1056, 352, 625, 12112, 281, 1581, 16344, 667, 1332, 323, 38883, 625, 685, 581, 13768, 50276, 7152, 339, 9852, 436, 2929, 253, 4477, 12661, 247, 747, 7792, 323, 18005, 253, 3290, 273, 4394, 12022, 1006, 800, 3210, 323, 3888, 3782, 253, 4477, 3665, 352, 347, 247, 767, 6082, 7982, 835, 9991, 849, 4561, 6667, 9184, 432, 1016, 643, 285, 7200, 1880, 253, 4561, 1650, 29217, 253, 1677, 2014, 1650, 273, 253, 1006, 800, 1566, 403, 4080, 33810, 253, 1966, 12710, 6667, 476, 671, 320, 1691, 715, 5301, 253, 4477, 2589, 253, 4679, 273, 9433, 824, 767, 6082, 7982, 281, 8612, 4394, 12022, 1006, 800, 3210, 342, 7000, 1783, 285, 5955, 20544, 273, 436, 789, 50276, 262, 12453, 253, 4942, 1679, 5421, 1673, 273, 4394, 12022, 2460, 5978, 285, 10262, 247, 7792, 347, 247, 7982, 323, 18005, 253, 3290, 253, 2934, 273, 970, 767, 5113, 310, 4460, 50276, 2520, 789, 310, 625, 273, 271, 16774, 789, 285, 352, 1057, 247, 1175, 2628, 275, 22980, 15250, 4679, 285, 253, 1783, 5301, 273, 8612, 4394, 12022, 2460, 5978, 3082, 342, 1966, 3045, 10870, 403, 2011, 25910, 285, 253, 5955, 310, 6832, 4583, 253, 4028, 3290, 310, 1029, 285, 253, 2600, 310, 6832, 50276, 783, 906, 3559, 275, 436, 2929, 651, 2085, 4722, 16039, 275, 253, 4394, 12022, 2460, 5978, 50275, 20881, 1255, 265, 273, 436, 789, 604, 597, 403, 6283, 9713, 516, 5211, 281, 2572, 619, 13716, 50276, 77, 471, 273, 5301, 342, 3786, 19732, 2961, 17082, 697, 2032, 326, 253, 4081, 1332, 310, 625, 45290, 685, 253, 17485, 4394, 908, 275, 2045, 2987, 323, 7103, 824, 347, 28669, 570, 17524, 2299, 10668, 651, 320, 671, 6110, 275, 849, 253, 747, 7792, 17066, 342, 824, 5593, 323, 1650, 849, 9991, 285, 263, 7200, 3007, 27619, 342, 17524, 1543, 50276, 783, 10895, 310, 3710, 281, 33039, 304, 11753, 643, 15302, 908, 275, 4394, 12022, 2460, 5978, 323, 1650, 260, 338, 274, 943, 671, 320, 50276, 46779, 347, 597, 778, 921, 1027, 13576, 534, 310, 253, 2770, 273, 752, 247, 7982, 651, 320, 12392, 275, 824, 247, 1083, 1966, 3453, 310, 417, 17887, 285, 3021, 417, 275, 253, 5301, 533, 1335, 253, 3486, 273, 1027, 3082, 3340, 342, 1027, 4373, 3602, 835, 253, 
2929, 5087, 247, 2257, 273, 4116, 651, 452, 8453, 50276, 8826, 2216, 10165, 403, 417, 2590, 923, 3533, 2708, 50274, 31973, 1309, 6438, 30080, 22559, 50276, 19687, 833, 4868, 432, 721, 281, 818, 50275, 2369, 2523, 2490, 187, 4118, 18435, 27, 2520, 2929, 2175, 253, 1895, 273, 1643, 11860, 5978, 327, 253, 33039, 304, 11753, 10895, 42455, 1643, 11860, 1006, 800, 3210, 1411, 7497, 597, 9569, 9991, 285, 3183, 50228, 17082, 285, 1347, 271, 16774, 1783, 273, 849, 2710, 3210, 403, 17860, 275, 253, 9991, 21888, 50228, 6415, 4103, 281, 7497, 50276, 1189, 455, 436, 310, 247, 973, 15720, 2929, 342, 247, 2590, 2926, 285, 4679, 352, 3400, 4722, 16039, 327, 849, 2710, 1006, 800, 3210, 35615, 14588, 281, 7497, 275, 247, 1798, 4836 ]
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: this paper presents a new continual learning problem setup continual knowledge learning ckl and constructs an associated benchmark resource the benchmark is based on slot fillingbased knowledge probing tasks ie the lama analysis the authors show the empirical performance of some existing cl methods ranging from regularization rehearsal and parameter expansion and they show a few findings based on their experimental results eg learning rate can be sensitive to balance the tradeoff between forgetting and learning new knowledge and ckl methods might have transferrable performance across different lms eg t5 and gpt strengths the problem of ckl itself is an important task and more realistic to downstream knowledgeintensive applications i like the clear separation of the types of knowledge timeinvariant to keep outdated to remove new to inject as well as their collected datasets for reflecting the three types of knowledge update different types of baseline methods are covered and compared with analysis weakness the formulation of the ckl problem is overly simplified it basically only considers a single corpus d1 for updating the knowledge of previously learned lms a general setup should consider a streaming version of d1 and make it a sequence of subcorpus d1 d2 dt that arrive at different time steps also the associated tasks in updatedlama and newlama should reflect such a time series that is the streaming version of the current probing tasks the current formulation described in section 31 only has a single time step its more like an offline learning problem with the forgetting constraints but not an online continual learning problem the proposed setup is thus a bit far from the motivation of studying ckl maintaining an everchanging lm the experiments are very limited to the lama probing which does not necessarily connect to real downstream applications of these lms its also hard to justify whether such methods can maintain performance in general nlp tasks the argument about the kilt experiments seems to only focus on testing the retention but not about the newupdated knowledge about d1 the analysis of the ckl methods is not deep enough how do these methods work and why do some outperform others how do we know if an arbitrary new fact conflicts with the existing knowledge or not on the fly how do you define such conflicts properly say you have a sentence in d0 cristiano ronaldo plays for xxx in 2010 and you have another sentence in d1 cristiano ronaldo plays for yyy now in the current problem setup how will such conflict be defined the new findings are not particularly nontrivial the presentation and the writing of the paper can be further improved with more illustrative examples and case studies for readers to qualitatively see the problem setup and the differences between these methods this paper is a pilot study of an important problem continual knowledge learning of lms but the problem formulation is overly simple and there are still many important yet missing points in both data construction and experiments also there are not many insightful and nontrivial findings with deep analysis of existing methods please find more details above docsepthe paper studies continual knowledge learning of language models which is an interesting and important problem particularly a new benchmark and a metric are introduced to quantify the retention of timeinvariant world knowledge the update of
outdated knowledge and the acquisition of new knowledge to establish baselines for the ckl benchmark and validate the rationality of the proposed benchmark and metric the author conducts extensive experiments with a pretrained encoderdecoder model t5 based on various training methodologies including regularization rehearsal and parameter expansion methods the paper is well organized and easy to follow the proposed continual knowledge learning problem is quite interesting and important the fuar metric is also technically sound the authors also conduct comprehensive experiments to verify the rationality of the proposed benchmark under the various settings strengths 1 the proposed continual knowledge learning problem is quite interesting and important 2 the benchmark is useful and the proposed fuar metric is technically sound weaknesses 1 the paper only performs experiments with an encoderdecoder model t5 the experimental results will be more convincing if more pretrained language models such as gpt are included for example we can explore the ability of different plms to avoid catastrophic forgetting and to acquire new knowledge while preserving invariant knowledge 2 consisting with the traditional setting of cl the paper also creates a setting for multiple ckl phases however only a twophase setting is considered more experiments can be explored such as fivephase or controlling the differences in the distribution of data at different phases 3 the experimental findings in this paper are somewhat trivial the paper studies an interesting problem the proposed benchmark and metric are technically sound however there are some concerns about experimental settings docsep the paper is about continuous learning for language models the authors leverage existing lama tasks and collect a new test benchmark with updated information and new information they investigate several existing cl algorithms and they propose a new metric called fuar to measure tradeoff between forgotten timeinvariant knowledge and updated or newly acquired knowledge they provided some findings on their continuous lm learning strengths they provide a new benchmark and metric to measure the retention of timeinvariant knowledge updated knowledge and new knowledge some interesting observations are provided for example 1 rehearsal methods do not work well in this setting even though the reason is quite obvious because some knowledge is updated and parameterexpansion methods achieve better results 2 lms are prone to more forgetting as they go through multiple ckl phases 3 lms should be pretrained with just a few epochs on less duplicating data for efficiency weakness in a realworld scenario how can one know in advance that the new task is truly new as mentioned on page 5 it is possible that the fuar score is very large if no gain and preserve knowledge the authors did not show experiments or analysis in such a setting where the new task has some knowledge overlapping with the learned tasks to make this concern more general i feel the measurement of task similarity is missing in this work can we also see perplexity as another dimension em scores cannot know how bad the prediction distribution is do you think the conclusion of these lama tasks is the same as other nlp downstream tasks do you think others can easily replicate your numbers do you run several splitting or seeds for multiple round cl settings have you tried different sizes of t5 models maybe the gap between vanilla and ckl methods will be smaller given larger models misc i have 
doubt on this sentence moreover the effectiveness of ckl methods is much reduced in multiphase ckl shown by the decrease of the gap between the fuar of t5vanilla and the best performing ckl method in the scenario of twophase and one phase which is 092 and 046 respectively isnt it only prove that t5vanilla can learn better fuar scores smallp1 smallp2 than smallp1 smallp2 because you have similar fuar scores for t5kadapters this work is quite insightful for us to understand more about how lm continuous learning works although i think more experiments could be beneficial as i mentioned in the weakness section if we can make sure the numbers from this paper are reproducible and comparable then i think it could be a good testbed for future research in this direction docsepthe authors formulate a new continual learning cl problem called continual knowledge learning ckl particularly they distinguished three subtasks in ckl ie the retention of timeinvariant world knowledge the update of old knowledge and the acquisition of new knowledge they also introduce a new benchmark and metric to quantify the performance of various stateoftheart models on these subtasks they find that ckl demonstrates unique challenges that are not present in previous cl setups critical causes of knowledge forgetting in ckl are also discussed strengths in terms of novelty the paper extended the definition of continuous learning to formulate continuous knowledge learning that has unique challenges compared to the traditional cl the paper also introduced a novel metric named fuar to measure the tradeoff between knowledge forgetting update or acquisition this is a contribution to the field as this quantitative metric could facilitate direct comparison between models performing ckl tasks the paper is technically sound extensive experiments were conducted to benchmark the performance of various cl approaches regularization rehearsal and parameterexpansion methods on different aspects of the ckl task retention of timeinvariant knowledge updating old knowledge and acquiring new knowledge in the appendix the authors also presented various ways to understand the models learning process such as the change of predicted outputs during the continued pretraining as well as the failure analysis based on the type of probes these methods provide more insight into the models learning process that went beyond the plain performance scores weaknesses using t5 the authors showed that parameterexpansion methods have the most robust performance throughout all of the experimental settings however in the experiments with gpt2 a decoderonly model in the appendix gpt2mixreview a rehearsal method performs the best although the authors mentioned that we leave more exploration of applying ckl methods on decoderonly models such as gpt2 architecture as future work they should still mention this critical discrepancy in the main body of the paper so that the readers are aware of the context of the findings after all large language models take various forms and both t5 and gpt2 are examples of large language models questions the authors demonstrated that parameterexpansion methods have the most robust performance throughout all of the experimental settings with t5 since the three methods regularization rehearsal and parameter expansion are not mutually exclusive i was wondering if the authors have tried any combination of the approaches for instance the rehearsal approach could be combined with the parameterexpansion method i was wondering whether a combined 
approach would yield even higher performance the paper formulated the problem of continual knowledge learning and benchmarked the performance of large language models on this task with different cl methods the tradeoff between forgetting existing knowledge and updating old knowledge or acquiring new knowledge is quantified through a new metric which would serve as an important optimization goal for future research this work is a big contribution to the community and would invite more research into this topic ### Summary:
the paper introduces the problem of continual knowledge language learning the authors point out the interesting duality between continual learning and knowledge learning where in knowledge learning one must avoid forgetting timeinvariant knowledge avoid forgetting in cl be able to acquire new knowledge learn new tasks in cl and replace outdated knowledge a form of forgetting and relearning or adaptation in their paper the authors develop an initial benchmark for the task along with a set of baselines and provide empirical studies the initial reviews were quite mixed the reviewers seem to agree this work studies an interesting and fairly novel direction for continual learning of language however the reviewers did not agree on whether this initial stab at the problem was enough in particular reviewer u9hk argues that the formulation is oversimplified and the current experiments are limiting after the discussion the reviewers remained split with one high score 8 two borderline accepts 3 and one reject so three reviewers believe that this manuscript is already a good contribution the fourth reviewer disagrees but the authors provided clear and convincing responses to many of their comments and point to results already available in the appendix overall this is a clear and reasonable first step considering this paper proposes a new cl problem the reviewers and i believe that this is interesting and rigorous enough to be impactful and to warrant followup works as a result im happy to recommend acceptance i imagine that if the community demonstrates interest in this line of work there will be work both on methodologies to improve the proposed baselines but also work proposing extensions to the problem in line with some of the comments of reviewer u9hk in preparing their cameraready version i strongly encourage the authors to take into account the suggestions of the reviewers and your replies in particular your discussion regarding encoderdecoder and decoderonly lms and the associated results would be good to discuss in the main text even if the full results are in the appendix
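the fuar metric is described above only informally, as the tradeoff between forgotten time-invariant knowledge and updated or newly acquired knowledge, and its exact definition is not reproduced in this document. the sketch below is a minimal illustration of one way such a ratio could be computed, assuming exact-match probing before and after continued pretraining; the function names, the probe format, and the scoring are hypothetical rather than the paper's actual implementation.

```python
# hypothetical sketch of a fuar-style ratio: forgotten time-invariant probes
# divided by the sum of successfully updated and newly acquired probes.
def exact_match(prediction: str, answer: str) -> bool:
    # simple normalized exact-match scoring
    return prediction.strip().lower() == answer.strip().lower()

def fuar_sketch(invariant_probes, updated_probes, new_probes, model_before, model_after):
    # each probe set is a list of (question, gold_answer) pairs;
    # model_before / model_after map a question to a predicted answer,
    # before and after continued pretraining on the new corpus
    forgotten = sum(
        exact_match(model_before(q), a) and not exact_match(model_after(q), a)
        for q, a in invariant_probes
    )
    updated = sum(exact_match(model_after(q), a) for q, a in updated_probes)
    acquired = sum(exact_match(model_after(q), a) for q, a in new_probes)
    gained = updated + acquired
    if gained == 0:
        # matches the reviewer's observation that the score blows up with no gain
        return float("inf")
    return forgotten / gained
```

under this reading a lower value is better, since less time-invariant knowledge is lost per unit of knowledge gained, which is consistent with how the reviews compare t5vanilla against the ckl methods and with the reviewer's point that the score becomes very large when nothing is gained.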
input_ids, attention_mask, labels: [token-id and mask arrays omitted; these columns hold the machine-readable tokenization of this row's review and summary text]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes to apply neural architecture search nas for connectivity pruning to improve the parameter efficiency of densenet the idea is straightforward and the paper is well organized and easy to follow my major concern is the limited contribution applying deep reinforcement learning drl and following the automl framework for architecture and parameter pruning has been extensively investigated during the past two years for instance this work has a similar motivation and design to amc automl for model compression and acceleration on mobile devices the experimental results also show a limited efficiency improvement according to table 1 although this is a debatable drawback compared with the noveltycontribution concern it is worth reconsidering the motivation of the proposed method given the fact that the automl framework is extremely expensive due to the drl design docsepthis paper proposes a layerbased pruning method based on reinforcement learning for pretrained networks there are several major issues for my rating lack of perspective i do not understand where this paper sits compared to other compression methods if this is about rl great if this is about compression there is a lack of related work and proper comparisons to existing methods at least conceptual ones claims about the benefit of not needing expertise are not clear to me as from the results it seems like expertise is needed to set the hyperparameters experiments are not convincing i would like to see something about computational costs current methods aim at minimizing training and finetuning costs while maintaining the accuracy how does this method stand in that regard how much time is needed to prune one of these models how many resources would it be possible to add this process into a training from scratch method how would this compare to training methods that integrate compression strategies table 1 shows incomplete results why also there is a big gap in the accuracy versus number of parameters tradeoff between this method and others presented in that table why docsepthe paper introduces an rl based approach to prune layers in a densenet this work extends blockdrop to the densenet architecture making the controller independent from the input image the approach is evaluated on cifar10 and cifar100 datasets as well as on imagenet showing promising results in order to improve the paper the authors could take into consideration the following points 1 given the similarity of the approach with blockdrop i would suggest discussing it in the introduction section clearly stating the similarities and the differences with the proposed approach 2 blockdrop seems to introduce a general framework of policy networks to prune neural networks however the authors claim that blockdrop can only be applied to resnets or its variants could the authors comment on this 3 in the abstract the authors claim our experiments show that densenet with lwp is more compact and efficient than existing alternatives it is hard to assess if the statement is correct given the evidence presented in the experimental section it is not clear if the method is more efficient and compact than others e g condensenet 4 in the experimental section addressing the following questions would make the section stronger what is more important flops or number of parameters what is the accuracy drop we should allow to pay for a reduction in number of parameters or flops 5 for the evaluation i would suggest to
show that the learned policy is better than a random one e g not using the controller to define the policy in line 20 of the algorithm and using a random policy instead 6 in table 1 some entries for densenet lwp are missing is the network converging for these setups 7 sigma is not explained in section 3.3 what is the intuition behind this hyperparameter 8 id suggest moving the related work section to the background section and expanding it a bit 9 in the introduction it achieved stateoftheart results across several highly competitive datasets please add citations accordingly additional comments 1 it might be interesting to compare the method introduced in the paper to a scenario where the controller is conditioned on an input image and adaptively selects the connections and layers in densenet at inference time 2 it might be interesting to report the number of connections in table 1 for all the models overall i liked the ideas presented in the paper however i think that the high degree of overlap with blockdrop should be addressed by clearly stating the differences in the introduction section moreover i encourage the authors to include the missing results in table 1 and run a comparison to a random policy in the current version of the manuscript it is hard to compare among different methods thus finding a metric or a visualization that would clearly outline the efficiency and compactness of the method would make the paper stronger ### Summary:
the paper proposes to apply neural architecture search for pruning densenet the reviewers and ac noted potential weaknesses of the paper in various aspects and decided that the authors need more work before the paper can be published
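one review above asks for a comparison between the learned pruning policy and a random policy; the sketch below illustrates what such a baseline could look like. the binary connection mask, the reward that trades accuracy against parameter count, and the use of sigma as the trade-off knob are assumptions made for illustration, not the paper's actual controller, algorithm, or hyperparameters.

```python
# hypothetical random-policy baseline for connectivity pruning, for comparison
# against a learned policy that outputs a keep/drop mask over dense connections.
import random

def random_mask(num_connections: int, keep_prob: float) -> list:
    # keep or drop each connection independently at random
    return [1 if random.random() < keep_prob else 0 for _ in range(num_connections)]

def reward(mask, accuracy_fn, param_count_fn, sigma: float = 1e-7) -> float:
    # assumed reward: validation accuracy minus a penalty on remaining parameters;
    # sigma acts as an accuracy/size trade-off knob
    return accuracy_fn(mask) - sigma * param_count_fn(mask)

def compare_to_random(learned_mask, accuracy_fn, param_count_fn, trials: int = 20) -> float:
    # fraction of random masks (with the same keep ratio) that beat the learned mask
    n = len(learned_mask)
    keep_prob = sum(learned_mask) / n
    learned_reward = reward(learned_mask, accuracy_fn, param_count_fn)
    wins = sum(
        reward(random_mask(n, keep_prob), accuracy_fn, param_count_fn) > learned_reward
        for _ in range(trials)
    )
    return wins / trials
```

a learned controller that does not beat this baseline by a clear margin would support the reviewer's concern about the value of the drl machinery.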
input_ids, attention_mask, labels: [token-id and mask arrays omitted; these columns hold the machine-readable tokenization of this row's review and summary text]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper presents an endtoend system that can recognize singlechannel multiplespeaker speech with multiple languages pros the paper is well written it shows that the existing endtoend multilingual asr seki et al 2018b and endtoend multispeaker asr seki et al 2018a techniques can be combined without any change to achieve reasonable performance it demonstrates the challenge of singlechannel multilingual multiplespeaker speech recognition and compares the performance of the multiplespeaker system on the mixed speech and the singlespeaker system on the isolated speech cons it lacks novelty the proposed framework simply combines the two existing techniques as mentioned above the training and evaluation data are both artificially created by randomly concatenating utterances in different languages from different speakers with different context i am not sure how useful the evaluation is since this situation is not realistic also currently it cannot test real codeswitching since the utterances are not related and not from the same speaker there are not enough analyses eg it would be good to analyze what contributes to the gap between the singlespeaker asr system performance on the isolated speech and the multilingual multispeaker asr system on the mixed speech how well does the proposed endtoend framework perform compared to a twostep framework with speaker separation followed by multilingual singlespeaker asr docsepthe authors propose to build a speech recognition system that has been trained to recognize a recording that has been produced by mixing multiple recordings from different languages together and allowing for some code switching also done artificially by concatenating different recordings while this sounds fancy and like a hard problem it is in fact easier than recognizing two speakers that have been mixed together speaking the same language which has already been solved in seki 2018a from what i can tell i dont see any contribution in this paper other than explaining how to create an artificial unrealistic database of mixed speech in multiple languages and then training a multispeaker endtoend speech recognition system on that database docsepthis paper presents a framework to train an endtoend multilingual multispeaker speech recognition system overall the paper is quite clearly written strengths experimental results show consistent improvements in speech recognition performance and language identification performance weakness im not sure whether the framework is novel the authors have just mixed training data from several languages to train an endtoend multispeaker speech recognition system i dont see the real motivation why the authors want to make the task harder than needed the example provided in figure 1 is very rare in reality the authors claimed that their system can recognise codeswitching but randomly mixing data from different languages is not codeswitching in general it would be better to have some more analyses showing what the system can do and why ### Summary:
the authors present a system for endtoend multilingual and multispeaker speech recognition the presented method is based on multiple prior works that propose endtoend models for multilingual asr and multispeaker asr the work combines these techniques and shows that a single system can do both with minimal changes the main critique from the reviewers is that the paper lacks novelty it builds heavily on existing work and does not make enough of a contribution to be accepted at iclr furthermore training and evaluation are all on simulated test sets that are not very realistic so it is unclear how well the techniques would generalize to real usecases for these reasons the recommendation is to reject the paper
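the reviews describe the evaluation data as artificially created by concatenating utterances in different languages (to simulate code switching) and mixing streams from different speakers into a single channel, but the exact recipe is not given here. the sketch below is a minimal, assumption-laden illustration of that construction; the sampling strategy, padding, and gain handling are hypothetical.

```python
# hypothetical construction of single-channel, multi-speaker, multilingual test data:
# concatenate utterances across languages, then overlap two speakers' streams.
import random
import numpy as np

def code_switched_stream(utterances_by_lang: dict, num_segments: int = 3) -> np.ndarray:
    # pick a random language, then a random utterance waveform from it, several times
    segments = []
    for _ in range(num_segments):
        lang = random.choice(list(utterances_by_lang))
        segments.append(random.choice(utterances_by_lang[lang]))
    return np.concatenate(segments)

def mix_two_speakers(stream_a: np.ndarray, stream_b: np.ndarray, level_db: float = 0.0) -> np.ndarray:
    # pad the shorter stream with silence, attenuate the second speaker by
    # level_db, and sum the two streams into one channel
    n = max(len(stream_a), len(stream_b))
    a = np.pad(stream_a, (0, n - len(stream_a)))
    b = np.pad(stream_b, (0, n - len(stream_b)))
    gain = 10.0 ** (-level_db / 20.0)
    return a + gain * b
```

as the reviewers note, data built this way contains neither genuine code switching nor related utterances from the same speaker, which limits how far conclusions drawn from it generalize.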
input_ids, attention_mask, labels: [token-id and mask arrays omitted; these columns hold the machine-readable tokenization of this row's review and summary text]
365, 4584, 347, 83, 253, 789, 24772, 841, 5609, 285, 2722, 326, 247, 2014, 985, 476, 513, 1097, 342, 8723, 2544, 50275, 783, 2022, 29254, 432, 253, 30628, 310, 326, 253, 2929, 19756, 38135, 352, 21168, 11306, 327, 5368, 789, 285, 50276, 18566, 417, 1056, 667, 2217, 9021, 281, 320, 7607, 387, 17857, 32888, 33810, 3733, 285, 27163, 403, 512, 327, 15524, 1071, 5239, 326, 403, 417, 1077, 15958, 594, 352, 310, 12744, 849, 973, 253, 5609, 651, 39970, 281, 1524, 441, 886, 1169, 323, 841, 4606, 253, 17401, 310, 281, 12009, 253, 2929 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review:
Reviewer 1:
1. The whole structure of this manuscript is easy to follow and understand, with good organization. The given method, with its source-adaptive idea, is novel in the field of the Dirichlet process.
2. Regarding the reproducibility of the method, it is easy to implement since the code is available and there are sufficient experimental details; thus I think it has good credibility and reproducibility.

Some motivations are not well explained:
1. Some motivations are not well explained. For example, the introduction mentions that selecting the number of clusters is important, and as far as I know there are many methods to select the number of clusters, such as BIC, AIC, or other criteria. Why does this work use the Dirichlet process mixture model to solve this problem?
2. Will the invertibility of the nonlinear neural network f reduce the representability of the network? Could you provide some discussion on this point?
3. Since the proposed Algorithm 1 adopts Gibbs sampling, which is recognized to be slow, how about the computational efficiency of this algorithm?

Reviewer 2:
This is a novel combination of DP and NICE-based feature learning, initialised with a VAE. In addition, the learning is via MCEM, which is effective. The generated examples in Figure 4 also show clearly that this method works very well.

The F-score is not really a useful metric for a clustering problem. The last statement in Section 4.2.3 and Figure 3 are not supported; for example, the clusters for 7 and 9 do not really get denser and more concentrated.

Minor points:
- Abstract: the term "capacity" may be better replaced by "flexibility".
- Section 1, 2nd paragraph: there should be a better word than "reversely". Section 1, 3rd paragraph: ditto for "insufficiency".
- Section 3.2, 2nd paragraph: x and y could be replaced by y and z to put it in the same context as Eq. 3 and the preceding derivations.
- Footnote 1: scalar vs. vector. n_g in Equation 7 is not defined. The second column heading of Table 2 is not appropriate.
- Figure 2 subcaption: "advance" should be "advantage". Figure 2 should have better elaboration. In Figure 2, k-means on the AE and flow-learned features is better than DDPM; this is worth pointing out and explaining further.
- Dempster (1977) is a better reference for EM; "The EM Algorithm and Extensions" is a better reference for MCEM.

Reviewer 3:
The paper is clearly written and the method is described well. The experiments fairly indicate the strengths of the model as well as potential weaknesses. Figure 1 is a particularly nice diagram of how the model works.

Unclear extent to which the DPM portion is having an effect: the paper claims that the DDPM method is an effective deep clustering method for high dimensions; however, it seems that most of the clustering power in the approach comes from the autoencoder, which projects the data down to ten dimensions in all cases. I think it is necessary to see an ablation where the autoencoder is made less powerful and the behaviour of the overall model is examined. In particular, Figure 3 seems to show that the encoder has achieved almost complete clustering even before any of the DDPM model runs; of course a t-SNE can be misleading in that particular case, but I think the authors need to address this general point. The authors also present the method as a combination of feature learning via the normalizing flow and cluster identification with the DPM, but judging by the MNIST example, the vast majority of the feature learning seems to be done in the pretraining, where the DPM is not involved.

Limited evaluation and ablations: the authors compare only against the G-means and DPM approaches in the experiments. I believe there are several recent approaches in deep unsupervised clustering, as discussed in the related work; it would be instructive to compare DDPM against these methods when viewing K as a hyperparameter. What if you choose the ground-truth K as the input for these clustering methods? What if K is slightly misspecified, or you try a couple of values of K? What if you give the same computational budget to DDPM and to competing deep clustering methods? These are the sort of questions that an effective experimental comparison could answer. In fact, DDPM also has hyperparameters to choose, such as the alpha_0 value in Table 1, which varies across seven orders of magnitude; how is this chosen?

Missing related work: the paper seems to be missing any discussion of Echraibi (2020), "On the variational posterior of Dirichlet process deep latent Gaussian mixture models". That work discusses an approach that seems quite similar, with a Dirichlet process model followed by a projection to the data space.
### Summary:
Meta review: the paper proposes a nonlinear clustering algorithm based on a Dirichlet process mixture model used as a latent-variable model in the latent space of a normalizing flow. In that way the model automatically learns a suitable number of clusters simultaneously with a nonlinear feature extractor. The reviews were largely positive and I recommend acceptance as a poster. However, one reviewer raised the point that an autoencoder was used for preprocessing; the results are thus somewhat inconclusive, since we observe the combined effect of the autoencoder and the proposed method. I agree with this criticism and strongly encourage the authors to include an ablation study, i.e., run the experiments without the autoencoder, so that the contributions of the autoencoder and the DPM-flow model become delineated.

Pros: well written, reproducible, good experimental results.
Cons: experiments not fully conclusive, limited ablations, potentially missing related work.
Quality: good. Clarity: good. Originality: fair. Significance: fair, but unclear to some extent.
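To make the clustering setup discussed above more concrete, the following is a minimal sketch of the general recipe the reviewers describe: fit a truncated Dirichlet-process mixture on features produced by a separately learned encoder, so that the number of clusters is inferred rather than fixed in advance. This is an illustrative stand-in, not the paper's DDPM method: it uses scikit-learn's variational `BayesianGaussianMixture` in place of the flow-based model trained with Gibbs sampling/MCEM, the synthetic features play the role of the pretrained autoencoder codes the reviewers mention, and the concentration parameter is only an analogue of the alpha_0 hyperparameter debated above.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Stand-in for 10-D codes from a pretrained encoder (autoencoder / normalizing flow).
rng = np.random.default_rng(0)
features = np.concatenate(
    [rng.normal(loc=c, scale=0.5, size=(200, 10)) for c in (-3.0, 0.0, 3.0)]
)

# Truncated Dirichlet-process mixture: n_components is only an upper bound;
# unused components receive negligible weight, so the effective K is inferred.
dpm = BayesianGaussianMixture(
    n_components=20,
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=1.0,  # rough analogue of the alpha_0 hyperparameter
    covariance_type="full",
    max_iter=500,
    random_state=0,
)
labels = dpm.fit_predict(features)

effective_k = int(np.sum(dpm.weights_ > 1e-2))
print(f"clusters with non-negligible weight: {effective_k}")
```

Counting the components with non-negligible weight is one simple way to read off the effective number of clusters, which is exactly the quantity the reviewers ask to compare against baselines that take K as a fixed hyperparameter.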
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 18, 253, 2644, 2605, 273, 436, 7714, 310, 3477, 281, 956, 285, 2096, 342, 1175, 6003, 253, 1677, 1332, 342, 253, 2603, 26672, 422, 2934, 310, 4460, 275, 253, 1673, 273, 253, 14035, 42878, 1232, 50275, 19, 5001, 253, 38041, 273, 436, 1677, 1332, 352, 310, 3477, 281, 3359, 1580, 253, 2127, 310, 2130, 285, 627, 403, 4209, 5661, 4278, 3021, 891, 1158, 352, 556, 1175, 17938, 285, 38041, 50276, 8826, 42852, 403, 417, 973, 5544, 50276, 18, 690, 42852, 403, 417, 973, 5544, 323, 1650, 347, 5393, 275, 253, 10199, 326, 17221, 253, 1180, 273, 9959, 310, 1774, 50276, 395, 347, 2080, 347, 891, 871, 326, 627, 403, 1142, 3082, 281, 3609, 253, 1180, 273, 9959, 824, 347, 43022, 247, 280, 390, 690, 643, 6866, 2139, 1057, 436, 789, 897, 253, 14035, 42878, 1232, 7802, 1566, 281, 8415, 436, 1895, 50276, 19, 588, 253, 30332, 2322, 273, 253, 14561, 11454, 2990, 269, 4796, 253, 1957, 1430, 273, 253, 2990, 812, 368, 2085, 690, 5955, 327, 436, 1127, 50276, 20, 1580, 253, 4081, 5933, 337, 47932, 253, 33342, 1768, 10491, 1332, 534, 556, 644, 7478, 281, 320, 3468, 50276, 5430, 670, 253, 15180, 6733, 273, 436, 5933, 5474, 33032, 2520, 310, 247, 4460, 5019, 273, 33234, 285, 5322, 3169, 4735, 4715, 3302, 1701, 342, 362, 3348, 275, 1635, 253, 4715, 310, 3066, 278, 336, 78, 534, 310, 3576, 50276, 783, 4561, 6667, 275, 4677, 577, 671, 2722, 4518, 326, 436, 1332, 2987, 1077, 973, 50276, 783, 269, 4868, 310, 417, 1663, 247, 4217, 7982, 323, 247, 17524, 3237, 50275, 783, 1390, 3908, 275, 2593, 38880, 285, 4677, 495, 403, 417, 4516, 323, 1650, 9959, 323, 818, 285, 898, 1057, 417, 1663, 755, 12006, 254, 285, 625, 16761, 50275, 15834, 253, 1307, 5350, 778, 320, 1805, 7932, 407, 15840, 50275, 4674, 337, 374, 2109, 5586, 627, 943, 320, 247, 1805, 3159, 685, 7661, 600, 50274, 4674, 337, 495, 5784, 5586, 277, 35570, 323, 39975, 50275, 4674, 4567, 374, 2109, 5586, 1269, 285, 340, 812, 320, 7932, 407, 340, 285, 1182, 281, 1691, 352, 275, 253, 1072, 3634, 273, 16186, 495, 285, 17691, 3538, 569, 50275, 8938, 9939, 337, 13434, 50276, 11000, 50275, 1251, 275, 5150, 818, 417, 2931, 50275, 9815, 5084, 13590, 323, 2829, 374, 310, 417, 4569, 50275, 13206, 374, 749, 34480, 7170, 50276, 11402, 486, 50275, 13206, 374, 943, 452, 1805, 50276, 293, 2735, 318, 50275, 13206, 374, 465, 30799, 327, 253, 247, 70, 285, 2685, 29343, 264, 3386, 310, 1805, 685, 32765, 2617, 436, 310, 4409, 13458, 562, 285, 2007, 5544, 50274, 9468, 81, 2971, 14960, 310, 247, 1805, 3806, 323, 802, 50275, 783, 802, 5933, 285, 18149, 310, 247, 1805, 3806, 323, 278, 336, 78, 5474, 339, 431, 248, 2929, 310, 4518, 3542, 285, 253, 1332, 310, 2529, 973, 253, 4679, 9648, 5224, 253, 20544, 273, 253, 1566, 347, 973, 347, 2442, 32213, 4677, 337, 310, 247, 3782, 5322, 10659, 273, 849, 253, 1566, 2987, 50276, 328, 8250, 6070, 281, 534, 253, 277, 2617, 5110, 310, 1907, 271, 1055, 253, 2929, 3916, 326, 253, 32765, 2617, 1332, 310, 271, 3576, 3676, 17524, 1332, 323, 1029, 10103, 2299, 352, 3133, 326, 954, 273, 253, 17524, 1612, 275, 253, 2746, 3249, 432, 253, 6753, 36465, 534, 6493, 253, 941, 1066, 281, 3578, 10103, 275, 512, 2219, 50276, 15870, 7103, 285, 490, 77, 569, 253, 4477, 7277, 760, 1411, 253, 305, 30799, 285, 277, 2617, 2746, 323, 253, 4679, 891, 2868, 627, 403, 2067, 3332, 7274, 275, 3676, 440, 35421, 17524, 347, 5469, 275, 253, 2905, 789, 352, 651, 320, 49664, 281, 7277, 32765, 2617, 1411, 841, 3082, 672, 14657, 465, 347, 247, 4373, 19484, 
50275, 33722, 2905, 789, 253, 2929, 3133, 281, 320, 5816, 667, 5955, 281, 21509, 376, 29294, 9169, 327, 253, 39762, 12637, 273, 14035, 42878, 1232, 3676, 21624, 305, 12064, 7802, 3210, 436, 789, 25339, 271, 2746, 326, 3133, 3240, 2074, 342, 247, 14035, 42878, 1232, 1566, 3560, 407, 247, 12378, 281, 253, 941, 2317, 12744, 6070, 281, 534, 253, 277, 2617, 5110, 310, 1907, 271, 1055, 891, 1158, 697, 3309, 281, 923, 271, 28913, 835, 253, 6753, 36465, 310, 1160, 1679, 6422, 285, 253, 8770, 273, 253, 4583, 1566, 310, 6730, 275, 1798, 4677, 495, 3133, 281, 921, 326, 253, 32049, 556, 6786, 2761, 3426, 17524, 1014, 1078, 667, 273, 253, 32765, 2617, 1566, 6613, 273, 2282, 247, 28669, 570, 476, 320, 24363, 275, 326, 1798, 1083, 533, 891, 1158, 253, 4477, 878, 281, 2953, 436, 2087, 1127, 253, 4477, 671, 1246, 253, 1332, 347, 247, 5019, 273, 4735, 4715, 3066, 253, 2622, 3006, 2685, 285, 7368, 8137, 342, 253, 277, 2617, 533, 32721, 407, 253, 278, 79, 382, 1650, 253, 8485, 5020, 273, 253, 4735, 4715, 3133, 281, 320, 2218, 275, 253, 3215, 26208, 835, 253, 277, 2617, 310, 417, 3206, 50276, 15870, 7103, 285, 490, 77, 569, 752, 604, 368, 5206, 253, 3216, 5083, 465, 347, 253, 3280, 323, 841, 17524, 3082, 752, 604, 465, 310, 5777, 3731, 1553, 1245, 390, 368, 1611, 247, 4564, 273, 2193, 273, 465, 752, 604, 368, 452, 253, 1072, 15180, 7563, 2130, 281, 32765, 2617, 285, 11771, 3676, 17524, 3082, 841, 403, 253, 3686, 273, 3533, 326, 271, 3576, 5661, 5301, 812, 3662, 50276, 249, 958, 32765, 2617, 671, 556, 4373, 22041, 281, 5206, 824, 347, 253, 9765, 17, 1318, 275, 2829, 337, 534, 16149, 2439, 5093, 7367, 273, 9777, 849, 310, 436, 6777, 50275, 187, 187, 4118, 18435, 27, 13518, 2278, 253, 2929, 29328, 247, 14561, 17524, 5933, 1754, 327, 247, 14035, 42878, 1232, 7802, 1566, 347, 247, 21624, 4778, 1566, 275, 253, 21624, 2317, 273, 247, 2622, 3006, 2685, 275, 326, 1039, 253, 1566, 8356, 33772, 247, 7470, 1180, 273, 9959, 10486, 342, 247, 14561, 4735, 4908, 263, 253, 10123, 497, 8127, 2762, 285, 891, 5583, 14924, 347, 247, 20731, 50276, 35529, 581, 37317, 5439, 253, 1127, 326, 323, 638, 21678, 271, 6753, 36465, 369, 908, 253, 1543, 403, 3021, 8489, 16656, 7426, 1580, 359, 10018, 253, 5678, 1055, 273, 6753, 36465, 285, 253, 4081, 1332, 891, 5194, 342, 436, 14226, 891, 971, 281, 7052, 11907, 253, 4477, 281, 2486, 271, 28913, 1263, 26332, 1408, 253, 4679, 1293, 6753, 36465, 594, 326, 253, 7680, 273, 6753, 36465, 285, 277, 2617, 5449, 1566, 4916, 30191, 456, 50276, 856, 84, 973, 3542, 41374, 1175, 5661, 1543, 50276, 5040, 4679, 417, 4751, 38662, 3710, 490, 77, 569, 7826, 5816, 2905, 789, 50275, 15177, 1175, 19843, 1175, 50276, 19164, 414, 4344, 8453, 4344, 533, 12744, 281, 690, 6070, 209 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 18, 253, 2644, 2605, 273, 436, 7714, 310, 3477, 281, 956, 285, 2096, 342, 1175, 6003, 253, 1677, 1332, 342, 253, 2603, 26672, 422, 2934, 310, 4460, 275, 253, 1673, 273, 253, 14035, 42878, 1232, 50275, 19, 5001, 253, 38041, 273, 436, 1677, 1332, 352, 310, 3477, 281, 3359, 1580, 253, 2127, 310, 2130, 285, 627, 403, 4209, 5661, 4278, 3021, 891, 1158, 352, 556, 1175, 17938, 285, 38041, 50276, 8826, 42852, 403, 417, 973, 5544, 50276, 18, 690, 42852, 403, 417, 973, 5544, 323, 1650, 347, 5393, 275, 253, 10199, 326, 17221, 253, 1180, 273, 9959, 310, 1774, 50276, 395, 347, 2080, 347, 891, 871, 326, 627, 403, 1142, 3082, 281, 3609, 253, 1180, 273, 9959, 824, 347, 43022, 247, 280, 390, 690, 643, 6866, 2139, 1057, 436, 789, 897, 253, 14035, 42878, 1232, 7802, 1566, 281, 8415, 436, 1895, 50276, 19, 588, 253, 30332, 2322, 273, 253, 14561, 11454, 2990, 269, 4796, 253, 1957, 1430, 273, 253, 2990, 812, 368, 2085, 690, 5955, 327, 436, 1127, 50276, 20, 1580, 253, 4081, 5933, 337, 47932, 253, 33342, 1768, 10491, 1332, 534, 556, 644, 7478, 281, 320, 3468, 50276, 5430, 670, 253, 15180, 6733, 273, 436, 5933, 5474, 33032, 2520, 310, 247, 4460, 5019, 273, 33234, 285, 5322, 3169, 4735, 4715, 3302, 1701, 342, 362, 3348, 275, 1635, 253, 4715, 310, 3066, 278, 336, 78, 534, 310, 3576, 50276, 783, 4561, 6667, 275, 4677, 577, 671, 2722, 4518, 326, 436, 1332, 2987, 1077, 973, 50276, 783, 269, 4868, 310, 417, 1663, 247, 4217, 7982, 323, 247, 17524, 3237, 50275, 783, 1390, 3908, 275, 2593, 38880, 285, 4677, 495, 403, 417, 4516, 323, 1650, 9959, 323, 818, 285, 898, 1057, 417, 1663, 755, 12006, 254, 285, 625, 16761, 50275, 15834, 253, 1307, 5350, 778, 320, 1805, 7932, 407, 15840, 50275, 4674, 337, 374, 2109, 5586, 627, 943, 320, 247, 1805, 3159, 685, 7661, 600, 50274, 4674, 337, 495, 5784, 5586, 277, 35570, 323, 39975, 50275, 4674, 4567, 374, 2109, 5586, 1269, 285, 340, 812, 320, 7932, 407, 340, 285, 1182, 281, 1691, 352, 275, 253, 1072, 3634, 273, 16186, 495, 285, 17691, 3538, 569, 50275, 8938, 9939, 337, 13434, 50276, 11000, 50275, 1251, 275, 5150, 818, 417, 2931, 50275, 9815, 5084, 13590, 323, 2829, 374, 310, 417, 4569, 50275, 13206, 374, 749, 34480, 7170, 50276, 11402, 486, 50275, 13206, 374, 943, 452, 1805, 50276, 293, 2735, 318, 50275, 13206, 374, 465, 30799, 327, 253, 247, 70, 285, 2685, 29343, 264, 3386, 310, 1805, 685, 32765, 2617, 436, 310, 4409, 13458, 562, 285, 2007, 5544, 50274, 9468, 81, 2971, 14960, 310, 247, 1805, 3806, 323, 802, 50275, 783, 802, 5933, 285, 18149, 310, 247, 1805, 3806, 323, 278, 336, 78, 5474, 339, 431, 248, 2929, 310, 4518, 3542, 285, 253, 1332, 310, 2529, 973, 253, 4679, 9648, 5224, 253, 20544, 273, 253, 1566, 347, 973, 347, 2442, 32213, 4677, 337, 310, 247, 3782, 5322, 10659, 273, 849, 253, 1566, 2987, 50276, 328, 8250, 6070, 281, 534, 253, 277, 2617, 5110, 310, 1907, 271, 1055, 253, 2929, 3916, 326, 253, 32765, 2617, 1332, 310, 271, 3576, 3676, 17524, 1332, 323, 1029, 10103, 2299, 352, 3133, 326, 954, 273, 253, 17524, 1612, 275, 253, 2746, 3249, 432, 253, 6753, 36465, 534, 6493, 253, 941, 1066, 281, 3578, 10103, 275, 512, 2219, 50276, 15870, 7103, 285, 490, 77, 569, 253, 4477, 7277, 760, 1411, 253, 305, 30799, 285, 277, 2617, 2746, 323, 253, 4679, 891, 2868, 627, 403, 2067, 3332, 7274, 275, 3676, 440, 35421, 17524, 347, 5469, 275, 253, 2905, 789, 352, 651, 320, 49664, 281, 7277, 32765, 2617, 1411, 841, 3082, 672, 14657, 465, 347, 247, 4373, 19484, 
50275, 33722, 2905, 789, 253, 2929, 3133, 281, 320, 5816, 667, 5955, 281, 21509, 376, 29294, 9169, 327, 253, 39762, 12637, 273, 14035, 42878, 1232, 3676, 21624, 305, 12064, 7802, 3210, 436, 789, 25339, 271, 2746, 326, 3133, 3240, 2074, 342, 247, 14035, 42878, 1232, 1566, 3560, 407, 247, 12378, 281, 253, 941, 2317, 12744, 6070, 281, 534, 253, 277, 2617, 5110, 310, 1907, 271, 1055, 891, 1158, 697, 3309, 281, 923, 271, 28913, 835, 253, 6753, 36465, 310, 1160, 1679, 6422, 285, 253, 8770, 273, 253, 4583, 1566, 310, 6730, 275, 1798, 4677, 495, 3133, 281, 921, 326, 253, 32049, 556, 6786, 2761, 3426, 17524, 1014, 1078, 667, 273, 253, 32765, 2617, 1566, 6613, 273, 2282, 247, 28669, 570, 476, 320, 24363, 275, 326, 1798, 1083, 533, 891, 1158, 253, 4477, 878, 281, 2953, 436, 2087, 1127, 253, 4477, 671, 1246, 253, 1332, 347, 247, 5019, 273, 4735, 4715, 3066, 253, 2622, 3006, 2685, 285, 7368, 8137, 342, 253, 277, 2617, 533, 32721, 407, 253, 278, 79, 382, 1650, 253, 8485, 5020, 273, 253, 4735, 4715, 3133, 281, 320, 2218, 275, 253, 3215, 26208, 835, 253, 277, 2617, 310, 417, 3206, 50276, 15870, 7103, 285, 490, 77, 569, 752, 604, 368, 5206, 253, 3216, 5083, 465, 347, 253, 3280, 323, 841, 17524, 3082, 752, 604, 465, 310, 5777, 3731, 1553, 1245, 390, 368, 1611, 247, 4564, 273, 2193, 273, 465, 752, 604, 368, 452, 253, 1072, 15180, 7563, 2130, 281, 32765, 2617, 285, 11771, 3676, 17524, 3082, 841, 403, 253, 3686, 273, 3533, 326, 271, 3576, 5661, 5301, 812, 3662, 50276, 249, 958, 32765, 2617, 671, 556, 4373, 22041, 281, 5206, 824, 347, 253, 9765, 17, 1318, 275, 2829, 337, 534, 16149, 2439, 5093, 7367, 273, 9777, 849, 310, 436, 6777, 50275, 187, 187, 4118, 18435, 27, 13518, 2278, 253, 2929, 29328, 247, 14561, 17524, 5933, 1754, 327, 247, 14035, 42878, 1232, 7802, 1566, 347, 247, 21624, 4778, 1566, 275, 253, 21624, 2317, 273, 247, 2622, 3006, 2685, 275, 326, 1039, 253, 1566, 8356, 33772, 247, 7470, 1180, 273, 9959, 10486, 342, 247, 14561, 4735, 4908, 263, 253, 10123, 497, 8127, 2762, 285, 891, 5583, 14924, 347, 247, 20731, 50276, 35529, 581, 37317, 5439, 253, 1127, 326, 323, 638, 21678, 271, 6753, 36465, 369, 908, 253, 1543, 403, 3021, 8489, 16656, 7426, 1580, 359, 10018, 253, 5678, 1055, 273, 6753, 36465, 285, 253, 4081, 1332, 891, 5194, 342, 436, 14226, 891, 971, 281, 7052, 11907, 253, 4477, 281, 2486, 271, 28913, 1263, 26332, 1408, 253, 4679, 1293, 6753, 36465, 594, 326, 253, 7680, 273, 6753, 36465, 285, 277, 2617, 5449, 1566, 4916, 30191, 456, 50276, 856, 84, 973, 3542, 41374, 1175, 5661, 1543, 50276, 5040, 4679, 417, 4751, 38662, 3710, 490, 77, 569, 7826, 5816, 2905, 789, 50275, 15177, 1175, 19843, 1175, 50276, 19164, 414, 4344, 8453, 4344, 533, 12744, 281, 690, 6070, 209 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review:
Reviewer 1:
In this paper the authors propose a monocular image-based 3D object detection method based on knowledge distillation. Specifically, the monocular image-based 3D object detector of Ma et al. (2019) is infused with depth cues by distilling knowledge from a LiDAR-based teacher model. At test time the model detects 3D objects without any intermediate depth prediction and ranks 1st on the KITTI benchmark dataset. Hence the proposed monocular 3D detector is end-to-end and more accurate than the other methods in this category, without adding extra depth-estimation overhead in between.

Strengths:
1. End-to-end monocular 3D detection without intermediate depth estimation, which still integrates depth cues relevant for 3D detection. Intermediate depth estimation is a real bottleneck in monocular 3D object detection, as shown in multiple previous works (Ma et al. 2021, Reading et al. 2021, Lu et al. 2021); hence the contribution of infusing depth cues directly into the monocular network via distillation is an important one.
2. The method infuses depth via distillation across the structured scene space, the object feature space, and the object result space. The ablation study allows one to deduce that these design components are important for 3D object detection. Furthermore, during distillation the authors mainly focus on the object features, which lets the network focus on them; this argument is also supported by the ablation study.
3. The paper shows that the accuracy improvement over the baseline model mainly comes from improved depth and dimension prediction of the objects; this has been shown via a valid cross-model experiment design.

Weaknesses:
1. The main claim of the paper is that depth cues can help 3D object detection from monocular images; however, results are shown for only one teacher model (Ma et al. 2019), which puts the generality of depth-cue-based distillation in question. Also, from Section 3.2 (student model), the choice of Ma et al. as a baseline is not clear. Can the authors explain what would happen with other monocular-based methods as the teacher, and show a small experiment?
2. The motivation for the scene-level distillation is not clear. The first line of Sec. 3.3 states "first, we believe that scene information is beneficial for our task", but it is not clear why. A very similar method to the proposed scene-level distillation via affinity maps has already been proposed and used for semantic segmentation (Hou et al. 2020). Can the authors explain why this would be beneficial for the 3D detection task, and please cite the relevant work.
3. "Besides, we further normalize the confidence of each predicted object using the estimated depth uncertainty (see Appendix A1 for more details)", which brings about 1 AP improvement. While it is completely fine to use tricks that bring the accuracy up, this needs to be clearly stated in the results. It is not clear whether the baseline model uses such normalization; if not, the comparison with the baseline is not an egg-to-egg comparison. This should be clearly stated in the paper by adding the accuracy of the current method without such normalization in Table 3 and Table 4, and by reporting the improvements from infusing depth cues and from confidence normalization separately (Section 4.2, comparison with state-of-the-art methods).

Other minor points:
1. Usually the projected LiDAR images will have empty regions (e.g., Fig. 2 has empty black regions at the top). How is this handled during training, given that the LiDAR image in Fig. 3 shows a complete image without any empty region? Also, do we need any preprocessing of the camera images at inference time that requires the use of LiDAR information?
2. Sec. 4.2, ablation studies: "3D detection performance by 3.34, 5.02, 2.98 and improve BEV performance by 5.16, 6.62, 3.87" — here the authors should compare against the confidence-normalized baseline numbers and not the raw baseline, because the improvements from confidence normalization are significant in this context.
3. Sec. 4.2, detailed design choice: "which improves the accuracy by 0.7" — 0.7 in what? I think this number is only valid for the 3D IoU 0.7 moderate benchmark; the authors should mention that in the text.
4. Sec. 4.2, comparison with state of the art: "by contrast our method only takes 25 ms to process a KITTI image"; however, the runtimes in Table 3 show 40 ms. Why this difference? I believe 40 ms is the valid number, since the baseline MonoDLE also reports a 40 ms runtime.
5. Table 3: the authors should also put the superscript on their own method, since it requires LiDAR data during training.

Typos: 4.2 ablation studies, "specifically we can found" → "specifically we found"; 4.2 comparison with state-of-the-art methods, "our method can only takes" → "our method only takes"; 4.3 what the model has learned, "improvement of dimension part is also considerable (e, f)" — I think the authors mean (c, f); 4.3 do we need depth estimation, "here we qualitatively show the information loss" — I think it is quantitatively, since Table 7 contains only accuracy numbers.

Reference: Hou, Yuenan, et al. Inter-region affinity distillation for road marking segmentation. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020.

I recommend this paper be accepted to the ICLR conference. Score 8, good paper. Overall the paper is well written and provides extensive ablation studies for its design choices. The authors propose a monocular image-based 3D object detection method that distills knowledge from a LiDAR-based teacher; at inference time the method runs at 25 fps on GPU without any intermediate depth prediction, which is critical for real-time applications such as autonomous driving. However, there are some minor issues in some of the claims (see above), and I would like to see the authors address the weaknesses in the rebuttal.

Reviewer 2:
This paper proposes to distill features from a LiDAR teacher model to a monocular-based student model. To align the feature maps between teacher and student, the teacher uses the same network as the student; the only difference is that the teacher takes the sparse/densified depth map as input. The techniques include scene-level distillation, object-level distillation in features/outputs, and feature fusion. In the experimental part the paper extensively ablates several key analyses, including a cross-modal evaluation between the baseline model and the full model, and depth estimation.

Pros:
1. The paper is well written; the overall pipeline is simple and the key factor of KD is well illustrated.
2. The motivation and experiments look reasonable to me. I think this is the first paper that distills features from a LiDAR model into a monocular model to learn depth cues, and it obtains good results and improvements. The experiments demonstrate that the student model learns to predict better 3D locations without extra inference cost; the improvement is indeed impressive, despite KD being a well-known approach to improving model performance.

Cons:
1. The main concern is the technical improvement for distillation. Since some papers [1] already demonstrate the effectiveness of cross-modal distillation, the paper should at least be compared with baseline KD approaches, demonstrate the superiority of the proposed modules (such as scene-/object-level distillation and whether to distill the affinity map), and discuss why the proposed methods work better than simple KD for monocular 3D object detection, or particularly for depth estimation. Otherwise the contribution is not enough, as it cannot provide any new information about cross-modal KD. Also, related works on KD for 2D object detection (e.g., [2]), being the main point of this paper, can be used as baselines and should at least be included.
2. From the ablation studies it seems that the depth cues are the key factor affecting the learning of the model. Is this possibly because the teacher network interpolates the sparse depth map and thus transfers the depth supervision better? I think an ablation study with dense depth-map supervision could help the analysis.
3. The performance improvement for pedestrians and cyclists is not so obvious; the comparison with the baseline model on the KITTI validation set is missing and should be supplemented.

[1] Cross modal distillation for supervision transfer. Gupta, S., et al. CVPR 2016.
[2] Learning efficient object detection models with knowledge distillation. Chen et al. NeurIPS 2017.

The basic motivation is quite good and worth studying, and the current ablation studies demonstrate the good points of cross-modal distillation; the performance also seems good enough relative to related works. But it is known that cross-modal KD is helpful, as shown in [1] above, so the idea is not new enough, and the technical improvement of the proposed KD method over baseline KD approaches should be strengthened. Overall I would give a rating of borderline reject.

Reviewer 3:
The paper presents MonoDistill, a way to enhance RGB-based 3D object detection through knowledge distillation from a LiDAR-based teacher. A 3D detection pipeline built around MonoDLE is trained both on RGB images and on densified LiDAR input. Three main mechanisms enable teacher-student transfer at the feature level: scene-level distillation aligns affinity maps between teacher and student, masked features are trained at the object level, and additional masked pseudo labels are leveraged around the teacher's centre predictions. Each component is carefully ablated, and experiments are conducted on the KITTI benchmark. In the following, strengths (S1-S4) and weaknesses (W1-W3) are detailed.

S1 Distillation pipeline: the way in which the student is trained through scene-level and object-level distillation, together with the extended pseudo labels, is an interesting approach to leveraging the LiDAR input at training time that does not require architectural changes between student and teacher.
S2 Comparison to intermediate depth estimation tasks: ablation experiments in the supplementary material question the use of the additional depth-estimation task often employed for 3D detection (also from stereo). They give the insight that this direction might be less effective, which might affect future network design choices.
S3 Results: the experimental evaluation of all involved components (the loss functions but also the densification) justifies their contributions. The overall accuracy of the pipeline is favourable over many state-of-the-art approaches.
S4 Paper presentation: the paper is clearly motivated and easy to understand.

W1 Fair comparison: in the comparison in Tab. 3 the presented method should also get a marker, as the LiDAR signal is implicitly used for training.
W2 Minor notation issues: in Eqn. 3, information is missing on the index k.
W3 Some typos, grammar mistakes and minor issues make some sections of the paper hard to read. In order of appearance: "vs" / "geometric constrain", "then use lidarbased model", "lots works", "monocular vs stereo", "results shows", "baseline model adopt and use several", "this model achieve", "models are mainly based on the 2d bounding box are used", "we can found that our full model improve", "can only takes", the point before Table 3, "table 7 show", "23488 vs 3712", "depth maps" vs "noisy depth".

While the paper has minor weaknesses, I believe they can be corrected during the review process. The strengths, in particular S1-S3, make this work worth sharing.

Reviewer 4:
This paper proposes an approach to leveraging knowledge distillation (KD) for training image-based monocular 3D detectors. Different from the ways prior works incorporate the LiDAR signal, this paper takes it as the input of a teacher network and thereby supplements the image-based student network with spatial information. Three levels of distillation are devised: scene-level, object-level in the feature space, and object-level in the result space. Experimental results show that this method can effectively boost the performance of image-based monocular 3D detection while maintaining outstanding efficiency.

Strengths: the basic idea is easy to follow; the motivation and core contributions are clearly presented; the methodology is simple yet effective. Some experimental designs are novel and convincing; for instance, Table 5 (cross-model evaluation) validates that the localization accuracy has indeed improved a lot, and Table 6 provides the counter-intuitive conclusion that the performance of the teacher model is not directly related to the performance improvement. The overall model achieves state-of-the-art on the KITTI benchmark while maintaining impressive efficiency.

Weaknesses: the main concern is that the technical novelty is limited; this paper is basically an application of knowledge distillation to monocular 3D detection, differing only in specific designs and conclusions due to the different task. The generalization ability of the method is not discussed: for example, if the settings of the LiDARs or cameras are changed, can the student network generalize well to other scenarios? If consistency of these intrinsic and extrinsic settings is necessary, the applicability of this kind of method will be much more limited. A minor problem is that the size of the KITTI dataset seems too small for now; it would be more convincing if the method could be validated on large-scale datasets such as nuScenes and Waymo.

Minor comments: there are some small grammatical typos, such as "thus no extra computational cost is introduced" (needs an "and" in front) and "existing lidarbased models based on" (needs an "are"); please double-check, fix them, and polish the writing. Figure 4: does the Gaussian-like mask mean Gaussian weights in Eqn. 4, or only center-sampling regions with equal weights? References to FCOS [1] for 2D detection and FCOS3D [2] for monocular 3D detection are also missing here. "Do we need depth estimation as an intermediate task?" — this claim is closely related to another missing reference; it does not matter much because it was published only recently: DD3D [3]. The difference in method design is clear, but an empirical comparison would be useful. Table 7: the source of training data for DORN is not clearly presented — is it from the KITTI depth dataset or other datasets, is there any data-leakage problem, and does it influence the result or conclusion in this part? Appendix A3: how was the depth error computed — for the top-k detection predictions or in another way?

References:
[1] FCOS: fully convolutional one-stage object detection. ICCV 2019.
[2] FCOS3D: fully convolutional one-stage monocular 3D object detection. ICCVW 2021.
[3] Is pseudo-LiDAR needed for monocular 3D object detection? ICCV 2021.

This paper applies knowledge distillation to image-based monocular 3D detection and achieves state-of-the-art results. It can provide a potential path to better leveraging LiDAR signals to boost image-based 3D detectors without much influence on the original design and efficiency. However, there are some concerns about novelty and generalization ability, so I would vote for borderline accept temporarily.

Reviewer 5:
Aiming at accurately detecting objects in 3D space from a single image, this paper proposes a new method; experiments on the KITTI dataset seem to show promising results.

1. This paper claims to be a monocular-based 3D object detection method. Although only RGB image data is needed at test time, point cloud data is actually used in the training process; from this point of view, the claim is not appropriate.
2. Figure 3 is very confusing; please redraw it according to the actual process in order to show it more clearly.
3. The paper claims in many places that methods based on monocular images are more valuable in application than methods based on pure point clouds, but it has to be admitted that 3D detection results based on images are far inferior to those based on point clouds; from this point of view, do the above remarks lack the measurement dimension of detection accuracy?
4. On the experimental results: (1) this paper chooses KITTI to conduct experiments; although good results are shown in Table 3, the method of this paper ("anonymous") is not seen on KITTI's official website. The results on the KITTI test set should be submitted to the official website — please submit your results anonymously — otherwise the authenticity of the experimental results in this article will be seriously questioned. (2) In the 3D object detection task more datasets have emerged, such as Waymo, nuScenes, etc.; this article should try more datasets.
5. The writing should be checked carefully; there are many singular/plural problems, symbol problems, tense problems, and so on, such as "need"/"needs" in Section 2 and the two repeated symbols f_j below Formula 1.

This paper has many problems in the experiments, especially the authenticity of the experimental results and the completeness and richness of the experiments, and there are many writing errors in the paper.
### Summary:
This paper received 5 quality reviews, with 3 of them rated 8, 1 rated 6 and 1 rated 5. In general, while there are minor concerns, the reviewers acknowledge the contribution of applying knowledge distillation to the problem of monocular 3D object detection and appreciate the SOTA performance on the KITTI validation and test sets. The AC concurs with these important contributions and recommends acceptance.
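Since several of the reviews above hinge on how the scene-level (affinity-map) and object-level (masked feature imitation) distillation terms are defined, a compact sketch of such losses is given below. This is a hedged illustration rather than the authors' MonoDistill implementation: the L1 objectives, the binary foreground mask (the paper reportedly uses Gaussian-like weighting around object centres), and the tensor shapes are assumptions made for clarity only.

```python
import torch
import torch.nn.functional as F

def affinity_map(feat: torch.Tensor) -> torch.Tensor:
    """Pairwise cosine similarity between spatial locations of a (B, C, H, W) feature map."""
    flat = F.normalize(feat.flatten(2), dim=1)      # (B, C, H*W), unit-norm per location
    return torch.bmm(flat.transpose(1, 2), flat)    # (B, H*W, H*W) affinity matrix

def distill_losses(student_feat, teacher_feat, fg_mask):
    """Scene-level affinity alignment + object-masked feature imitation.

    fg_mask: (B, 1, H, W) binary mask around object centres so that imitation
    focuses on foreground regions; a real implementation might use soft
    (Gaussian-weighted) masks and normalize by the number of foreground pixels.
    """
    scene_loss = F.l1_loss(affinity_map(student_feat), affinity_map(teacher_feat))
    obj_loss = F.l1_loss(student_feat * fg_mask, teacher_feat * fg_mask)
    return scene_loss, obj_loss

# Toy usage with random tensors standing in for student/teacher backbone features.
s = torch.randn(2, 64, 12, 40, requires_grad=True)
t = torch.randn(2, 64, 12, 40)
mask = (torch.rand(2, 1, 12, 40) > 0.9).float()
scene_l, obj_l = distill_losses(s, t, mask)
(scene_l + obj_l).backward()
```

The sketch only illustrates why such terms require no architectural change between student and teacher (both losses operate on same-shaped feature maps), which is the property Reviewer 3 highlights as S1.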
[ 77, 684, 2067, 2234, 6260, 1690, 2831, 24353, 7103, 875, 8245, 1566, 285, 2120, 1566, 6864, 13418, 50276, 856, 84, 337, 186, 783, 2929, 310, 973, 3542, 253, 4583, 15722, 310, 2969, 285, 253, 2234, 2803, 273, 465, 69, 310, 973, 12800, 50276, 19, 186, 783, 16038, 285, 4679, 1007, 5272, 281, 479, 891, 1158, 436, 310, 253, 806, 2929, 326, 940, 3171, 253, 3386, 432, 253, 16486, 274, 1566, 281, 253, 1114, 26292, 1566, 281, 3037, 6864, 26638, 534, 4850, 1175, 1543, 285, 7756, 4679, 7568, 253, 5974, 1566, 33772, 281, 3283, 1805, 495, 69, 4328, 1293, 17032, 2105, 253, 7756, 310, 6296, 13943, 5747, 465, 69, 1146, 247, 973, 4304, 2746, 281, 3157, 1566, 3045, 50276, 5040, 337, 186, 783, 2022, 4468, 310, 670, 253, 7681, 7756, 323, 940, 21755, 347, 627, 403, 690, 9380, 337, 326, 14371, 253, 12510, 273, 2831, 24353, 940, 21755, 891, 1158, 253, 2929, 943, 387, 1878, 320, 2429, 342, 253, 8245, 465, 69, 7274, 285, 7568, 253, 34385, 273, 253, 4081, 11911, 824, 347, 6200, 6082, 5251, 940, 21755, 1880, 281, 940, 408, 15430, 3711, 285, 2319, 2139, 253, 4081, 3082, 789, 1805, 685, 2969, 465, 69, 323, 1114, 26292, 495, 69, 1789, 5481, 390, 3782, 6864, 13418, 5010, 253, 7680, 310, 417, 2217, 347, 352, 2550, 2085, 667, 747, 1491, 323, 2831, 24353, 465, 69, 671, 253, 2905, 2987, 670, 465, 69, 323, 374, 69, 1789, 5481, 24088, 374, 347, 253, 2022, 1127, 273, 436, 2929, 476, 320, 908, 347, 253, 8245, 285, 943, 320, 2908, 387, 1878, 50276, 19, 186, 4064, 253, 28913, 2175, 352, 3133, 326, 253, 6864, 26638, 403, 253, 2234, 2803, 281, 2818, 253, 4715, 273, 253, 1566, 310, 352, 1896, 1955, 281, 253, 9732, 2990, 20670, 684, 253, 23507, 6864, 3711, 285, 3021, 253, 2990, 21916, 1805, 253, 6864, 20446, 891, 1158, 271, 28913, 1263, 273, 14086, 6864, 3711, 20446, 476, 1361, 253, 1783, 495, 186, 783, 3045, 7756, 323, 34792, 285, 6776, 382, 310, 417, 594, 4755, 253, 5301, 342, 253, 8245, 1566, 275, 465, 21498, 12820, 310, 9829, 285, 943, 320, 19079, 50275, 18, 2831, 30771, 940, 21755, 323, 20446, 3700, 1149, 37668, 256, 1162, 355, 30105, 1087, 4022, 374, 4715, 5919, 1789, 5481, 3210, 342, 3640, 940, 21755, 260, 864, 1162, 267, 5723, 2824, 4240, 50275, 783, 5044, 16038, 310, 3240, 1175, 285, 310, 4409, 12392, 253, 1655, 28913, 2175, 671, 7568, 253, 1175, 1127, 273, 2831, 24353, 940, 21755, 253, 3045, 3133, 1175, 2217, 275, 253, 2905, 2987, 533, 697, 1929, 326, 2831, 24353, 465, 69, 310, 9371, 347, 2011, 275, 253, 2022, 2278, 337, 534, 310, 417, 247, 747, 2217, 2934, 285, 253, 7681, 7756, 273, 253, 4081, 465, 69, 1332, 689, 253, 8245, 465, 69, 2746, 943, 320, 34615, 625, 4583, 891, 651, 751, 281, 1918, 253, 13716, 273, 45210, 12009, 5474, 339, 431, 248, 2929, 10262, 1114, 351, 382, 408, 247, 1039, 281, 7278, 46206, 3169, 495, 69, 1789, 5481, 949, 3640, 940, 21755, 432, 247, 16486, 274, 3169, 9732, 247, 495, 69, 5481, 15722, 4270, 1475, 1114, 351, 282, 310, 10166, 1097, 327, 46206, 3888, 285, 12006, 1245, 16486, 274, 3280, 1264, 2022, 5122, 8046, 9732, 39095, 3700, 327, 4735, 1268, 6200, 5251, 940, 21755, 8495, 84, 15430, 8115, 875, 9732, 285, 5974, 1223, 34741, 3386, 403, 10166, 327, 1789, 5251, 3081, 34741, 17927, 13301, 403, 19732, 2961, 1475, 253, 9732, 9145, 13650, 1016, 4445, 310, 9257, 490, 16148, 285, 4679, 403, 5196, 327, 253, 465, 21498, 22791, 50276, 249, 253, 1563, 20544, 256, 1047, 285, 32213, 259, 1012, 403, 7000, 50276, 84, 18, 940, 21755, 15722, 253, 1039, 275, 534, 253, 5974, 310, 10166, 949, 6200, 5251, 285, 1789, 5251, 940, 21755, 2366, 342, 253, 6508, 17927, 13301, 310, 271, 4722, 2746, 281, 25057, 253, 16486, 
274, 3280, 387, 3733, 673, 534, 1057, 417, 2430, 27934, 2544, 875, 5974, 285, 9732, 50276, 84, 19, 5301, 281, 10444, 6864, 13418, 8892, 28913, 4679, 403, 5196, 281, 1953, 253, 10393, 273, 253, 3081, 4836, 273, 6864, 13418, 2223, 908, 323, 495, 69, 5481, 671, 432, 36167, 275, 253, 24864, 2144, 597, 1918, 12288, 326, 436, 3884, 1537, 320, 1679, 3576, 436, 1537, 2818, 2852, 2990, 2216, 10165, 50276, 84, 20, 1543, 253, 5661, 7103, 273, 512, 3206, 4295, 2957, 3470, 533, 671, 253, 12006, 1877, 816, 7790, 616, 9021, 253, 4583, 7200, 273, 253, 15722, 310, 39262, 689, 1142, 1375, 23037, 248, 1445, 7274, 50276, 84, 21, 2929, 9759, 253, 2929, 310, 4518, 17194, 285, 3477, 281, 2096, 50276, 88, 18, 4344, 5301, 275, 253, 5301, 10334, 495, 253, 3559, 1332, 943, 671, 755, 247, 50276, 284, 253, 16486, 274, 2625, 310, 29688, 908, 323, 3733, 50276, 88, 19, 5884, 14951, 1491, 275, 16186, 79, 495, 1491, 310, 5816, 327, 253, 3605, 465, 50276, 88, 20, 690, 963, 993, 28146, 16503, 285, 5884, 3374, 1056, 690, 7118, 273, 253, 2929, 1892, 281, 1239, 841, 403, 275, 1340, 273, 7286, 4632, 17856, 37709, 840, 897, 16486, 274, 3169, 1566, 8783, 2987, 1114, 26292, 4632, 36167, 1543, 2722, 8245, 1566, 5283, 285, 897, 2067, 436, 1566, 5115, 3210, 403, 7194, 1754, 327, 253, 374, 69, 41113, 3817, 403, 908, 359, 476, 1119, 326, 776, 2120, 1566, 3157, 476, 760, 3936, 1127, 1078, 2829, 495, 2829, 818, 921, 27812, 2055, 4632, 5345, 805, 6864, 8115, 4632, 27620, 6864, 1223, 253, 2929, 556, 5884, 32213, 891, 2868, 326, 597, 476, 320, 15045, 275, 253, 2282, 273, 253, 2278, 1232, 253, 4056, 384, 84, 275, 46596, 255, 256, 1012, 1056, 436, 789, 4409, 9628, 50276, 7152, 33032, 2520, 2929, 29328, 271, 2746, 281, 19732, 2977, 3640, 940, 21755, 465, 69, 323, 3733, 2460, 3169, 1114, 26292, 495, 69, 25421, 1027, 432, 253, 4088, 281, 19071, 253, 16486, 274, 2625, 275, 2720, 2987, 436, 2929, 3936, 352, 347, 253, 3280, 273, 247, 9732, 2990, 285, 2007, 26434, 253, 2460, 3169, 5974, 2990, 275, 2426, 273, 8820, 1491, 1264, 2308, 273, 940, 21755, 6200, 5251, 1789, 5251, 275, 253, 4735, 2317, 285, 1789, 5251, 275, 253, 906, 2317, 403, 32434, 5661, 1543, 921, 326, 436, 1332, 476, 8069, 9510, 253, 3045, 273, 2460, 3169, 1114, 26292, 495, 69, 5481, 1223, 1335, 11850, 16383, 6733, 20544, 50276, 783, 5044, 2934, 273, 436, 2929, 310, 3477, 281, 956, 50276, 783, 16038, 285, 5161, 9021, 403, 4518, 3559, 50276, 783, 16182, 310, 2969, 2568, 3576, 50276, 8826, 5661, 11809, 403, 4460, 285, 21414, 323, 4227, 2829, 608, 323, 2831, 7645, 7103, 476, 17813, 326, 253, 14536, 7200, 556, 6296, 5520, 247, 2257, 285, 2829, 721, 3400, 247, 4828, 565, 48714, 6452, 326, 253, 3045, 273, 253, 9732, 1566, 310, 417, 3587, 2905, 281, 253, 3045, 7756, 50276, 783, 4583, 1566, 33526, 1375, 23037, 14387, 327, 253, 465, 21498, 22791, 1223, 11850, 13943, 6733, 50276, 20881, 1255, 265, 50276, 783, 2022, 4468, 310, 326, 253, 7681, 38135, 310, 3710, 436, 2929, 310, 10323, 271, 2898, 273, 3640, 940, 21755, 327, 1114, 26292, 495, 69, 5481, 1223, 760, 1027, 275, 2426, 273, 2173, 11809, 285, 11815, 1955, 281, 1027, 8892, 50276, 783, 26647, 3745, 273, 436, 1332, 310, 417, 5469, 323, 1650, 604, 253, 7533, 273, 16486, 1032, 390, 14693, 403, 4391, 476, 253, 5974, 2990, 39970, 281, 643, 15216, 973, 604, 253, 15274, 273, 841, 15276, 285, 38988, 7533, 310, 3240, 3309, 253, 2898, 273, 436, 2238, 273, 1332, 588, 671, 320, 1199, 625, 3710, 50276, 66, 5884, 1895, 310, 326, 253, 1979, 273, 253, 465, 21498, 10895, 3133, 1512, 1355, 323, 1024, 352, 651, 320, 625, 21414, 604, 253, 1332, 476, 320, 17618, 
327, 1236, 2510, 25912, 15302, 824, 347, 295, 19387, 24453, 285, 1039, 6972, 50276, 37585, 5701, 50276, 9088, 403, 690, 1355, 47412, 474, 963, 993, 824, 347, 3021, 642, 4465, 15180, 2105, 310, 5611, 3198, 271, 285, 275, 2914, 285, 5368, 16486, 274, 3169, 3210, 1754, 327, 3198, 271, 403, 4496, 4021, 5903, 4993, 731, 285, 671, 40167, 253, 4028, 50276, 13206, 577, 253, 305, 12064, 3022, 8989, 2097, 305, 12064, 13461, 275, 16186, 79, 577, 390, 760, 12127, 312, 4906, 4811, 342, 4503, 13461, 1060, 352, 671, 19756, 3806, 281, 269, 4752, 337, 323, 374, 69, 5481, 285, 269, 4752, 20, 69, 374, 323, 1114, 26292, 495, 69, 5481, 50276, 3088, 359, 878, 6864, 13418, 347, 271, 10444, 4836, 436, 1750, 310, 8244, 2905, 281, 1529, 5816, 3806, 352, 1057, 417, 2647, 984, 352, 310, 1663, 4102, 3863, 32765, 20, 69, 495, 253, 3064, 275, 2426, 273, 1332, 2216, 310, 2590, 533, 352, 476, 320, 4217, 604, 271, 16774, 5301, 476, 320, 5196, 50276, 2420, 818, 352, 310, 417, 4518, 3559, 253, 2603, 273, 3733, 941, 323, 277, 1575, 310, 352, 432, 465, 770, 504, 81, 394, 390, 643, 15302, 310, 627, 667, 2856, 1079, 518, 486, 1895, 1057, 352, 4833, 253, 906, 390, 6452, 275, 436, 629, 50276, 50237, 247, 20, 849, 369, 253, 6864, 2228, 10302, 310, 352, 2218, 323, 1755, 76, 5481, 13650, 390, 275, 643, 4088, 50276, 250, 3065, 50276, 18, 269, 4752, 4751, 27311, 267, 327, 383, 486, 1789, 5481, 17857, 17312, 6247, 50276, 19, 269, 4752, 20, 69, 4751, 27311, 267, 327, 383, 486, 1114, 26292, 495, 69, 1789, 5481, 17857, 17312, 88, 43425, 50276, 20, 310, 10585, 9528, 274, 3058, 323, 1114, 26292, 495, 69, 1789, 5481, 17857, 17312, 43425, 436, 2929, 10384, 3640, 940, 21755, 281, 2460, 3169, 1114, 26292, 495, 69, 5481, 285, 33526, 1375, 23037, 14387, 1543, 352, 476, 2085, 247, 2442, 1854, 281, 1805, 25057, 16486, 274, 6298, 281, 9510, 253, 2460, 3169, 495, 69, 25421, 1223, 417, 1199, 29189, 253, 3236, 2216, 285, 6733, 2299, 627, 403, 690, 7350, 275, 2426, 273, 38135, 285, 26647, 3745, 594, 891, 651, 6273, 323, 45210, 2997, 20220, 5474, 33032, 26400, 387, 13613, 15549, 5113, 275, 253, 495, 69, 2317, 432, 247, 2014, 2460, 436, 2929, 29328, 247, 747, 1332, 4679, 327, 465, 21498, 10895, 3133, 281, 921, 12532, 1543, 337, 436, 2929, 3916, 281, 320, 247, 1114, 706, 833, 495, 69, 1789, 5481, 1332, 3738, 760, 46206, 2460, 941, 310, 3058, 275, 253, 1071, 1232, 1127, 9005, 941, 310, 2686, 908, 275, 253, 3733, 1232, 432, 436, 1127, 273, 1859, 436, 1750, 310, 417, 4569, 50276, 19, 4677, 495, 310, 1077, 21643, 4496, 2502, 2040, 352, 2556, 281, 253, 4588, 1232, 275, 1340, 281, 921, 253, 4588, 1232, 625, 4518, 50276, 20, 436, 2929, 3916, 275, 1142, 5053, 326, 253, 1332, 1754, 327, 1114, 26292, 2460, 310, 625, 9865, 275, 2898, 685, 253, 1332, 1754, 327, 6313, 1127, 9005, 533, 352, 556, 281, 320, 8176, 275, 253, 5481, 906, 326, 253, 906, 273, 495, 69, 1789, 5481, 1754, 327, 2460, 310, 2080, 18134, 281, 326, 1754, 327, 6313, 1127, 594, 432, 436, 1127, 273, 1859, 513, 253, 1840, 16157, 3480, 253, 6814, 7877, 273, 5481, 7200, 50276, 21, 327, 253, 5661, 1543, 337, 436, 2929, 448, 4863, 436, 465, 21498, 281, 2589, 4679, 3738, 253, 1175, 1543, 403, 2011, 275, 2829, 495, 253, 1332, 273, 436, 2929, 17679, 310, 417, 2326, 327, 465, 770, 261, 3565, 4422, 253, 1543, 327, 253, 465, 21498, 1071, 873, 943, 320, 9262, 281, 253, 465, 21498, 3565, 4422, 594, 4496, 11929, 634, 1543, 26314, 4087, 327, 253, 465, 21498, 3565, 4422, 5010, 253, 40318, 273, 253, 5661, 1543, 275, 436, 3929, 588, 320, 10369, 17801, 374, 275, 253, 495, 69, 1789, 5481, 4836, 625, 941, 5239, 452, 13082, 
824, 347, 1039, 6972, 295, 19387, 2979, 3966, 436, 3929, 943, 1611, 327, 625, 941, 5239, 50276, 22, 253, 25527, 943, 320, 10141, 9257, 627, 403, 1142, 11098, 285, 25540, 3237, 9484, 3237, 29341, 3237, 285, 594, 327, 824, 347, 878, 50234, 275, 2593, 374, 767, 6015, 14217, 269, 75, 2708, 7212, 337, 436, 2929, 556, 1142, 3237, 275, 4679, 3340, 253, 40318, 273, 253, 5661, 1543, 285, 253, 29867, 285, 37175, 273, 253, 4679, 285, 627, 403, 1142, 4028, 6332, 275, 436, 2929, 2490, 187, 4118, 18435, 27, 2520, 2929, 2959, 608, 3290, 10123, 342, 495, 273, 731, 20139, 854, 337, 20139, 721, 285, 337, 20139, 608, 275, 2087, 1223, 627, 403, 5884, 7350, 253, 30628, 14409, 253, 7680, 273, 9433, 3640, 940, 21755, 281, 253, 1895, 273, 1114, 26292, 495, 69, 1789, 5481, 285, 11435, 253, 256, 5503, 3045, 327, 253, 465, 21498, 12820, 285, 1071, 5239, 253, 913, 15038, 84, 342, 841, 1774, 9021, 285, 32636, 14924 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper proposes an algorithm and an analysis of its convergence the algorithm propose to learn a model of temporal data y1 yt given input x1 xt the observations are assumed to arise from a deterministic latent h governed by a piecewise continuous ode in between consecutive times ti ti1 with additional deterministic jumps at transitions in the odernn paper the latent h can be expressed as a single ode for the whole time horizon rewriting the jump with a skip transition this paper appears to me as taking this expression and choosing a particular bounded form for the dynamics jump and readout functions a statistical asymptotic analysis of the convergence of the algorithms for random times and inputs is given methodology i find the paper quite difficult to read i blame both its structure and my lack of ease with the mathematics used here however from what i have understood of the algorithm proposed i find the methodological contribution very limited clarity i come from the machine learning community and read with no difficulty papers cited in the related work section in comparison i find this paper extremely difficult to read and parse despite containing the same kind of information asymptotic analysis i leave to other reviewers the evaluation of the convergence analysis my evaluation being partial my confidence rating is set accordingly for a machine learning paper presenting in the end a 3 line simple algorithm the paper contains a lot of superfluous mathematical notation that crowds the paper and make the reading very tedious many of the papers cited brouwer 2019 rubanova 2019 li 2020 offer a much smoother read in that respect as is this paper feels better suited to a more specialist statistics venue for example many elements are introduced in the main text and are not really necessary to understand what the paper does the detailed section on random inputs is used only in a theorem coming later why have it in the main text in so much details on the other end a description of the method this paper builds on is left into appendices additional comments the formatting of the references is very inconsistent please update docsepsummary this paper introduces neural jump ordinary differential equations as a method for learning models of continuoustime stochastic processes sampled at random time epochs specifically the paper studies the problem of estimating the marginal conditional expectation ie the l2 optimal approximation conditional on the available information by estimating an auxiliary stochastic differential equation parameterized by neural networks that approximates the conditional expectation of the process of interest at each point in time the neural networks are trained by using a randomized mean squaredloss objective the main theoretical results in the paper include asymptotic consistency of the optimal objective value in the limit of a large neural network as well as consistency of a monte carlo sample average estimator of the value the paper also establishes the l2 convergence of the estimated auxiliary solution to the marginal conditional expectation the technical details in the paper are mostly sound and i believe it should be of interest to a wide community the question of estimating stochastic models sampled at regular or irregular intervals is of broad utility there are some technical issues however but these can be resolved i believe in particular in the 
conclusions of theorem 41 and 42 it seems as though the authors claim almost sure convergence unless i am misunderstanding their statement what the authors establish is convergence in l2 but why does is imply almost sure convergence wouldnt one require uniform integrability to conclude more furthermore this is not a process level convergence result and therefore i do not believe that they can conclude as in remark g3 that the limit holds almost surely also the authors seem to suggest tin remark g2 that theyre not establishing l2 convergence but this could be a problem with the writing coming to the writing i note that i did find the paper somewhat sloppy in its use of terminology and notation for instance on p1 the authors state while stochastic processes are continuous in time this is not quite true since one can define discretetime stochastic processes i also found the discussion around justifying irregular sampling of the stochastic process to be poorly written in particular it is stated that dividing the timeline into equallysized intervalsis again making assumptions and information is lost well any sampling will involve a loss of information and the randomized sampling process described in this paper also involves assumptions i dont think this comment is appropriate furthermore the authors do not make a clear case for why their irregular sampling procedure is appropriate im quite certain that the sampling process introduces bias into the estimation for instance theorem 1 of ref 1 below provides sufficient conditions under which an irregularly sampled estimator of a functional of an sde is unbiased the authors must do a better job of justifying their method i would also urge them to add an example of a randomized sampling process for instance a poisson process sampler would satisfy their definition in which case the sampling time epochs form an ordered statistic coming to the development of the stochastic model it is unclear to me as to why all of the random objects cannot be defined on the same sample space essentially couldnt one view the sampling process as a point process on the same sample space supporting the sde next in prop 21 the authors state that the optimal adapted process approximating process is hatxt but hatxt is only defined pointwise ie at each time t and it is not defined as a stochastic process indeed for that the authors must describe the finite dimensional distributions for all finite sets of time epochs to define the stochastic process i believe it is inappropriate to call this a stochastic process this doesnt affect the main results since the authors only establish convergence in an l2 sense where the full distribution is not necessary some further minor comments 1 change the term observation dates to observation epochs 2 change amount of observations to number of observations or samples 3 on p3 in the definition of lambdak the set mathcalb0t is undefined 4 the notation defining the function tildemu is very confusing please change 5 p4 since the variation of u should be since the total variation of u 6 what do you mean by ergodic approximation of the objective isnt it simply a sample average approximation which ergodic theorem is playing a role here 7 i would also urge you clearly define what you mean by mathbblconvergence for completion 1 unbiased estimation with square root convergence for sde models changhan rhee and peter w glynndocsepthe authors propose a method for learning the conditional expectation of stochastic process in an online fashion the paper 
bears a considerable theoretical treatment derived from the stochastic filtering literature which is present both in the main body of the paper and the appendix besides the model the paper also aims to provide a theoretical justification of the convergence of their method i find the contribution of the paper somewhat obscure its aims to be incremental with respect to the previous literature and the experimental validation heavily unconvincing i support my recommendation through the following points following the well known by now neural ode and neural jump sde the contribution of the paper seems minor the authors state that they focus on giving theoretical guarantees however these are specific and loosely validated experimentally there is a fair amount of space dedicate to the theoretical presentation of the background i agree with the importance of theory but i failed to see how that theory supports the claims of the paper the experiments are limited only 3 synthetic examples and only one realworld one the authors states that their method focuses on approximating directly the conditional expectation this seems to be a different with the previous literature however if thats the case the authors should consider more benchmarks such as linear filters adapted to nonuniformlyspaced data gaussian processes or general time series models this paper does have a contribution my recommendation is that the authors show it in a clearer to the point manner with an improved experimental validation docsepthe submission studies a simplified model of odernn and gruodebayes theoretically proves convergence results and presents experimental results in companion with the theoretical results the paper does a good job in defining concepts precisely though this has come at a cost of highly complex notation which may hinder the average researcher in the mldiffeq community who may not have a strong background in probability theory to understand the paper i would therefore recommend the authors to simplify the notation by deferring the precise mathematical definition of concepts such as information sigma algebras to the appendix the section sec 24 on optimal approximation of a stochastic process in the main text is somewhat vague optimality certainly depends on the cost function being considered in which case the appendix states that the 2norm is used here the particular norm being used is somewhat independent of the construction of the probability space eg we could consider the same prob space and evaluate the difference between the random variable and the fixed prediction using some other function say the metric function induced by the l1norm this makes terms such as l2omega x omega tilde minimizer somewhat confusing note my comment here is somewhat handwavy about the precise technicalities but it should convey the relevant idea my main concern regarding the paper is about novelty it seems that the model considered in section 3 falls broadly in line with odernn and gruodebayes on the other hand the experiments section also doesnt compare against latent ode which is a strong but relevant baseline the section sec 4 on theoretical convergence results mostly assume that the erm can already be found this rather strong assumption therefore leaves the theorems in that section not unexpected and at the same time less relevant for practitioners it is also unclear whether convergence rates can be derived the paper does a decent job in clarifying its relationship with prior work postrebuttal i thank the authors for 
improving the presentation of the paper and including additional experiments comparing to latent ode
### Summary:
This paper proposes a refinement and analysis of continuous-time inference schemes. The paper received in-depth criticism from some very thoughtful and expert reviewers, and the authors seem to have taken it to heart. I am still worried about the similarity to GRU-ODE-Bayes, but I feel that the clarifications to the general theory of continuous-time belief updates are a worthy contribution, and the proposed method is a practical one. One reviewer didn't update their score, but the other reviewers put a lot of thought into the discussion and also raised theirs. I do think the title and name of the method are a bit misleading; I would call it something like consistent continuous-time filtering, because the jump ODE is really describing beliefs about an SDE.
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

Summary: this work tries to understand why features from trained language models can be used to solve classification tasks effectively. A language model (LM) in the analysis is modeled as a feature map $f: S \rightarrow \mathbb{R}^d$ together with a word embedding $\Phi \in \mathbb{R}^{d \times V}$, and a trained language model is thought of as $\epsilon$-optimal in terms of its cross-entropy from the true distribution over $S$. The work shows that for classification tasks approximately solvable by linear functions over the distributions of the next token, the best linear classifier based on $f$ and $\Phi$ will suffer an error of $O(\sqrt{\epsilon})$. The authors also propose an additional assumption where the log-partition function is quadratic in $f$, based on which some improvements can be obtained. Inspired by this assumption, a new objective function, Quad, where the partition function is directly replaced by a quadratic of $f$, is proposed. Experiments seem to support key assertions in the theoretical analysis.

Strengths: 1. The authors' approach to a well-posed question seems original to me; in particular, some proposed concepts, such as the refined transferability coefficient, conditional mean features, and the substitutability matrix, might be useful for future studies. 2. The article is precise, well written, and cautious in its tone. The accompanying experiments are informative and supportive of the main theoretical claims. I enjoy the overall journey the authors presented and would love to see more well-reasoned articles like this in ICLR.

Weaknesses: 1. The presentation can be improved by allocating more space to the ideas in the extensions section. This part seems more creative, perhaps a little less coherent (but expected for a mathematical exploration), and is too compressed as it stands (see suggestion 1).

Minor issues: 1. Consider replacing "partial sentences" with "prefixes", which is more technically accurate. 2. The many notations involving $p$ have inconsistent meanings for their subscripts; I would suggest consolidating them to reduce confusion. For example, consider using notations of the form $p_w(s, \theta)$, perhaps boldface for when it is viewed as a vector; similarly for $\ell$ (see suggestion 2). 3. Below section 2.1: "trained to learn" should be "trained to fit".

Suggestions: 1. I think a moderate revision reducing some parallel elaboration on the unconstrained language model should provide the space needed for the extensions and other novel ideas. After all, I found the results on unconstrained LMs somewhat trivial, and I suspect that you hope to use them only as an instructional tool; perhaps a serial layout would save space and even improve the perceived emphasis. 2. I strongly suggest a review of the notations and adopting a more consistent scheme. One trick I found useful is to follow the notational convention of a textbook or a classic paper; the current notation has too much overloading and variability.

Questions: 1. It seems to me that the central question lacks strong practical motivation. The NLP community seems to be moving toward preferring a natural answer in the form of a generated sentence instead of a label from a classifier; as you have argued, many classification tasks can be framed as predicting the next token, perhaps in the presence of a prompt. What is your opinion? 2. How realistic is the $\epsilon$-optimality condition on cross-entropy for LMs? Can you comment on any associated sample complexity bounds?

Post-rebuttal update: thank you for replying to my questions. I am still concerned about the sample complexity associated with $\epsilon$-optimality in cross-entropy, even in the trivial case, and it is perhaps impossible for some low-dimensional representations, as LMs are over a countably infinite extended alphabet.
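To make the quantity I am worried about explicit: the way I read $\epsilon$-optimality (my own shorthand, which may not match the paper's exact statement) is that the excess cross-entropy of the learned model $p_{\theta}$ over the true distribution $p^{*}$, averaged over contexts, is small, which is the same as a small average KL divergence:

$$\mathbb{E}_{s}\!\left[\ell_{\mathrm{xent}}\big(p_{\theta}(\cdot\mid s)\big)\right] \;-\; \mathbb{E}_{s}\!\left[\ell_{\mathrm{xent}}\big(p^{*}(\cdot\mid s)\big)\right] \;=\; \mathbb{E}_{s}\!\left[\mathrm{KL}\big(p^{*}(\cdot\mid s)\,\|\,p_{\theta}(\cdot\mid s)\big)\right] \;\le\; \epsilon.$$

Certifying this from finite data over a countably infinite vocabulary is exactly the sample-complexity question I raised, since the downstream guarantee degrades as $O(\sqrt{\epsilon})$.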
docsep

Summary of review: there has been a lot of interest in understanding why self-supervised learning approaches, such as the next-word prediction task, learn a useful representation for downstream tasks. This paper provides a mathematical framework to understand this question. One novel finding of this paper is that the distribution of the next word conditional on the context can provide a strong discriminative signal for the downstream task. In particular, using a carefully selected subset of prompt words, the authors observe that learning a linear predictor over the next-word distributions of these words achieves performance close to a pretrained GPT-2 model.

Setting and main result: this paper focuses on classification tasks, and the bulk of the work goes into how to model the next-word distributions as features or representations. For this purpose the authors introduce the definition of a natural task; informally, a task is defined as natural if, just by using the next-word distributions as features, the downstream task can be solved with a small loss. Result 1: under the above assumption on the downstream task, this paper provides a bound on the empirical loss of the downstream prediction task. This bound consists of two parts: the first part measures how natural the task is, that is, how well the task can be solved using the next-word distributions as features; the second part measures the difference between the empirical next-word distributions and the optimal next-word distributions. Result 2: the authors further extend this result to word embedding features, which are obtained by a weighted average of word embedding vectors based on the next-word distributions. There are several follow-up results built on these two results, such as a new loss objective for predicting the downstream task, but to the best of my understanding these two results are the main claims of this paper. A key parameter that occurs in obtaining the above results is a worst-case coefficient that bounds the distributional shift between the language model distributions of the training dataset and those of the downstream task; intuitively, this parameter arises from translating the natural-task assumption, which only guarantees transfer on average, to the downstream task.

Pros: a new framework for understanding why learning how to predict the next word helps the downstream task. This paper finds that the next-word distributions of a subset of prompt words contain discriminative signals and are good features; this seems to be a novel finding and may help inspire future work in this important direction.

Cons: the main result (Thm 4.1) applies to next-word conditional distributions that are very close to the optimal distribution. It is unclear to me how the authors are going to justify this assumption; should we expect the empirical distribution to converge to the true distribution when there is an infinite amount of samples? Secondly, the main result depends on the worst-case coefficient, which is also unclear to me. For the transferability coefficient proposed in section 5.1, is it possible to measure it in experiments? How large should we expect this coefficient to be?

Writing: the writing is overall clear and easy to follow, although it took me quite some time to map out the definitions of various notations. Many of the notations look cumbersome, and I suspect that there is still room for making the notations more accessible for new readers.
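As a reference point for the detailed comments below: in my own shorthand (which may differ from the paper's notation), the two feature constructions in results 1 and 2 are the next-word distribution itself and its average word embedding,

$$x_{1}(s) \;=\; p_{\theta}(\cdot\mid s) \in \mathbb{R}^{V}, \qquad x_{2}(s) \;=\; \Phi\, p_{\theta}(\cdot\mid s) \;=\; \sum_{w} p_{\theta}(w\mid s)\,\phi_{w} \in \mathbb{R}^{d},$$

with the downstream task solved by a linear classifier on top of either feature.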
Detailed comments: p2, sec 1.1: "analyze the efficiency language model features" should be "analyze the efficiency of language model features". p2, sec 2: you started introducing these notations without explaining what they mean; for example, the $p^{*}$ notation is also defined in sec 2.1. p2, sec 2: "where $p^{*}(\cdot\mid s)$ is used as a vector on the left and a distribution on the right": what does this sentence mean? p3, sec 2.2: "achieve lower test perplexity than traditional n-gram models": why is this true? Could you add a reference? p5, sec 4.1: "the result suggests that small test cross-entropy, hence test perplexity": same question as above. p6, sec 4.3: "in fact $f$ almost always performs better than": this part seems intriguing despite the linear relationship shown in figure 1; could you discuss this more here? p8, table 2: the results from using Quad look worse than the above two; could you explain the significance of this result again? p24, figure 2: what are the x and y axes, and what does each dot mean in this figure?

docsep

Summary: this paper presents an explanation of why pretraining on language modeling (LM) helps performance on downstream text classification tasks. The explanation relies on formulating classification tasks as next-word prediction tasks, i.e., language modeling. They use their theoretical results to design the Quad objective and experiment with it on SST and AG News, finding that it performs close to, but slightly worse than, standard cross-entropy training of classifiers.

Overall, this work contributes an interesting framework for analysis; however, I have one large conceptual concern about the framework. Central to the proposed explanation is the ability to formulate text classification tasks as next-word prediction, possibly with a prompt appended to the input (e.g., for sentiment analysis of movie reviews, "this movie is"). In a trivial sense this is always possible: we can simply append the task definition to the end of an input as a question (e.g., "is the sentiment of the review positive?") and check the probabilities of yes/no. Then predicting the answer to this prompt is equivalent to performing the task, and a perfect LM is of course able to perform the task perfectly. This formulation makes sections 3 and 4.1 feel trivially true, though to the authors' credit they do have to do additional work to extend an LM that is eps-optimal in next-word cross-entropy (i.e., on average) to optimality on the specific task formulation. However, the authors don't mention this trivial reformulation strategy and instead base their argument on the existence of heuristic words (e.g., for sentiment analysis, the probability of particular indicator words after a review). This strategy introduces the potential for spurious correlations: the heuristics might be strongly correlated with the task in general but might be off due to other factors, like sarcasm. Additionally, relying on these single-word heuristics seems a bit off to me, as many text classification tasks don't readily admit single words that encapsulate the task definition or label semantics. There's an argument, then, that the theory described here doesn't apply to these tasks (i.e., they're not (τ, B)-natural), but what's frustrating about this argument is that the theory doesn't provide us a way to distinguish which tasks fall in the category of single-word predictable, or how to find such words other than trial and error. It is very likely I am misunderstanding something about this paper: I am not sure what it means for a task to lie in the row span of word embeddings.
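For concreteness, the trivial reformulation I have in mind looks like the following toy sketch; the prompt, the label words, and the stub LM are my own inventions for illustration, not the authors' code.

```python
# Toy sketch of "classification as next-word prediction" (reviewer's own
# illustration, not code from the paper). The stub below stands in for any
# trained language model that returns P(next word | context).

def next_word_probs(context: str) -> dict:
    # Hypothetical stand-in LM with a hard-coded toy distribution so the
    # example runs end to end; a real LM would be queried here instead.
    if "wonderful" in context.lower():
        return {"great": 0.40, "fine": 0.20, "terrible": 0.05}
    return {"terrible": 0.35, "dull": 0.20, "great": 0.10}

def classify_review(review: str) -> str:
    # Append a prompt and compare probability mass on label-indicative words.
    probs = next_word_probs(review + " This movie is")
    positive = sum(probs.get(w, 0.0) for w in ("great", "fine"))
    negative = sum(probs.get(w, 0.0) for w in ("terrible", "dull"))
    return "positive" if positive >= negative else "negative"

print(classify_review("A wonderful, heartfelt film."))   # -> positive
print(classify_review("Two hours of my life wasted."))   # -> negative
```

My concern above is precisely that the choice of prompt and label words here is a heuristic, and it can be confounded by factors such as sarcasm.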
docsep

This paper studies why language model pretraining has been such an effective technique in improving downstream performance across a wide range of NLP tasks recently. In particular, it considers language models which compute a probability distribution over the next word in a text given the previous context. Then, taking inspiration from recent work showing that many downstream tasks can be reframed as sentence completion tasks, it defines a natural task as one on which a sparse linear model over the output of the true language model (the next-word probability distribution conditioned on context) attains strong performance. Theoretically, it shows that language models which are close to the true language model are guaranteed to attain strong performance on natural tasks. Empirically, it demonstrates that several NLP tasks are natural.

Strengths: the paper is generally quite clearly written, and the claims are well validated. The definitions, models, and assumptions in the paper are intuitive and clear (e.g., natural task). The analysis which builds on these definitions/models/assumptions provides meaningful theoretical insight into why language model pretraining may be so beneficial for downstream training; it provides a nice theoretical framework for thinking about the connection between language models and downstream tasks, which future work could build on. The empirical validation is thoughtful and relatively thorough. Even though the results don't show that the proposed loss function and the proposed conditional mean features give improvements over baselines, the empirical results show that the basic assumptions and definitions in the theoretical analysis are relatively realistic; for example, figure 4 validates assumption 4.1 (the log-partition function is roughly quadratic in $\theta$), and table 1 shows many real tasks are approximately natural. Furthermore, when there is a gap between the empirical results and the theoretical results (e.g., the validation of lemma 4.3 at the end of section 4), the paper makes these limitations clear, which I appreciated very much as a reader; the paper does not overclaim its contributions.

Weaknesses: it is unclear if there are real practical applications of the insights from this paper; neither the proposed Quad loss function nor the theoretically inspired conditional mean features perform better than the baselines. The current analysis doesn't apply directly to BERT, which is trained to predict masked words in a sentence instead of the next word; furthermore, BERT doesn't predict these masked words using a linear softmax model over a contextual embedding for the whole sentence, which is the assumed structure for the softmax language models considered in the analysis. This limitation is acknowledged in the conclusion, which is good. The paper doesn't explain why learning a linear model directly on the context embeddings $f(s)$ performs better than using the contextual mean embeddings. One idea I had here: could you define a natural task as one for which there exists a sparse linear model over the logits of $p(\cdot\mid s)$ which performs well, instead of a model directly over $p(\cdot\mid s)$? Due to the very flat portions of the softmax function, there can be meaningful differences between the logits corresponding to two different words while the LM probabilities for those words are extremely similar, and thus harder for a linear model to distinguish; with this definition, a linear model of the logits is also a linear model over the context embeddings $f(s)$ directly. There are some points in the paper that could be made clearer. I think it should be discussed earlier (in the intro/related work) why the paper focuses on language models which do next-word prediction via linear
softmax models over fixed-dimensional context embeddings, and that BERT is out of scope. I think there should be more discussion about the implications of proposition 2.2. As I understand it, this result shows that any part of $p_{f(s)}$ orthogonal to rowspan$(\Phi)$ doesn't affect the cross-entropy of the language model (the first-order optimality condition would still be satisfied); however, this doesn't necessarily imply that $p_{f(s)}$ will be in span$(\Phi)$ for all contexts $s$. In particular, the architecture of the embedding model $f$ likely constrains $f$ in such a way that makes it impossible for $p_{f(s)}$ to be in span$(\Phi)$ for all contexts $s$. Furthermore, at the end of section 3, it should be better explained why the assumption that $p_{f(s)}$ is in span$(\Phi)$ for all $s$ implies that definition 3.2 should only consider sparse models $v$ which are in this span as well: decompose $v = v_{\mathrm{in}} + v_{\mathrm{out}}$ (the components of $v$ in the span and orthogonal to the span); then $v^{\top} p = (v_{\mathrm{in}} + v_{\mathrm{out}})^{\top} p = v_{\mathrm{in}}^{\top} p$. I found the discussion in section 4.1 pretty confusing, in particular the part that argued why $B = O(1/\alpha)$. Overall, I really enjoyed reading this paper and found it to be quite insightful. It provided me with a much more thoughtful explanation for why language model pretraining improves downstream task performance, beyond simply "it helps learn good general representations of language using large amounts of unlabeled text data" (my previous reasoning). As a result, I recommend acceptance for this paper.

Nits: grammar in the last sentence of section 1.1 ("analyze the efficiency of"). Proposition 2.2: maybe write $\forall s \in \mathcal{S}$ instead of $\forall s \sim p_L$. Section 3, "append a prompt like 'this movie is'": the final quotation mark is on the next line. Equation 5: use $\sup$ instead of $\max$. Discussion in section 4.1: I think figure 4 should be explained in more detail in the caption and/or text. Using capital and lower-case tau in theorem 4.2 is confusing notation; similarly, using bold and non-bold $b$ in theorem 5.2 is confusing notation. After definition 5.1: what does "$\omega_w\,\omega_w$" mean? In table 1: can you explain more explicitly in the caption and text what "subset" and "class words" mean? Also, can you add a column where a dense linear model over $p_{f(s)}$ is used?

docsep

Summary: this work relates pretraining performance with downstream performance for tasks that can be reformulated as next-word prediction tasks. The authors show that, for such tasks, if the pretraining objective is $\epsilon$-optimal, then the downstream objective of a linear classifier is $\mathcal{O}(\sqrt{\epsilon})$-optimal.

Strengths: to the best of my knowledge, this is the first work that mathematically justifies the connection between the pretraining objective and the downstream performance. The proof technique (pretraining performance, to covariance of pretraining errors, to covariance of downstream errors, to downstream performance) is itself interesting. If the paper is accepted, I am looking forward to seeing a high-level proof sketch in the main part, as is done in section 2.1 of Arora et al. 2015 (https://arxiv.org/abs/1502.03520); a three-line explanation at the end of section 4.1 seems a bit scarce to me. The paper is well written: it gives an appropriate context, presents the main theoretical results, and verifies some of the claims experimentally.

Major concern: if I understand correctly (and please correct me if I am wrong), in theorem B.1 the ratio between the downstream error $\ell_{\mathcal{T}}(p(\cdot\mid s)) - \tau$ and the pretraining error $\ell_{\mathrm{xent}}(p(\cdot\mid s)) - \ell_{\mathrm{xent}}^{*}$ is hidden in the $\gamma_{p_{\mathcal{T}}}(p(\cdot\mid s))$ coefficient. Let me elaborate on this. 1. In lemma D.1, with the help of lemma D.9, you show that $\frac{1}{\gamma_{p_{\mathcal{T}}}(p(\cdot\mid s))}$ is an upper
bound for the ratio $\frac{\boldsymbol{v}^{\top}\Sigma_{p_{\mathcal{T}}}\boldsymbol{v}}{\boldsymbol{v}^{\top}\Sigma_{p_L}\boldsymbol{v}}$. 2. The latter ratio seems proportional to the ratio $\frac{\ell_{\mathcal{T}}(p(\cdot\mid s)) - \tau}{\ell_{\mathrm{xent}}(p(\cdot\mid s)) - \ell_{\mathrm{xent}}^{*}}$. I am not sure on this; my intuition is based on your lemma D.2 and the fact that, for a $p(\cdot\mid s)$ with full support, a non-precise reverse version of Pinsker's inequality (https://en.wikipedia.org/wiki/Pinsker%27s_inequality#Inverse_problem) holds. In a nutshell, aren't you showing that

$$\text{downstream error} \;=\; \mathcal{O}\!\left(\sqrt{\text{pretraining error} \cdot \frac{\text{downstream error}}{\text{pretraining error}}}\right)?$$

Issues: why don't you verify the main claim ($\epsilon$-optimality in pretraining propagates as $\mathcal{O}(\sqrt{\epsilon})$-optimality on downstream tasks) empirically? For this you may want to vary the language modeling performance, e.g. by pruning the language model, and then verify that the downstream loss increase is indeed $\mathcal{O}(\sqrt{\text{pretraining loss increase}})$. I believe such an experiment will definitely make the submission stronger. I don't see why your theory does not generalize to masked language modeling (MLM); why do we need to treat $s$ as the left context only? Given the success of MLM as a powerful pretraining objective, please consider formulating your claims in a more general way.

Minor issues: at the beginning of section 2.3, $p_{\mathcal{T}}$ is introduced as a distribution over $\mathcal{S} \times \{\pm 1\}$, but later (e.g., in formula 5) it is used as a distribution over $\mathcal{S}$ only; please clarify/fix this. What is the margin of task $\mathcal{T}$ mentioned on p5? Is it the margin of an SVM classifier that solves $\mathcal{T}$?

Limitations: the authors admit that their work is limited to a particular type of downstream task. Indeed, it is not clear how one can reformulate, e.g., linguistic tasks like POS tagging or dependency parsing as a next-word prediction task.

Update after the authors' response: during the rebuttal, the authors clarified my major concern and provided additional experiments that verify the main claim of the paper. I am totally satisfied with the authors' response; hence I am changing the score from 6 to 7.
### Summary:
The paper attempts to provide a theoretical explanation for the benefit of language model pretraining on downstream classification tasks. In this regard, the authors provide a mathematical framework which seems to indicate that the distribution of the next word conditional on the context can provide a strong discriminative signal for the downstream task. The reviewers found the formulation insightful, interesting, and novel; reviewers also enjoyed reading the well-written paper and appreciated its cautious tone. As correctly pointed out by the reviewers, the proposed framework might not directly align with techniques used in practice, applicability of the framework to other pretraining approaches is limited, and there are some unresolved concerns about the $\mathcal{O}(\sqrt{\epsilon})$ assumption. Still, the reviewers reached a consensus that the framework would be beneficial for the community and attract follow-up works; thus I recommend acceptance to ICLR. Following a reviewer suggestion, it is strongly recommended that the extensions section be expanded in the revised version using the extra page.
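For context on where the square root comes from: the rate reflects the standard Pinsker-type conversion from KL divergence (excess cross-entropy) to total variation distance; this is a well-known inequality offered here only as a gloss on the reviewers' discussion, not as the paper's exact argument.

$$\mathrm{KL}\big(p^{*}(\cdot\mid s)\,\|\,p_{\theta}(\cdot\mid s)\big) \le \epsilon \;\;\Longrightarrow\;\; \big\|p^{*}(\cdot\mid s) - p_{\theta}(\cdot\mid s)\big\|_{1} \le \sqrt{2\epsilon},$$

so an $\epsilon$ excess in pretraining cross-entropy only forces the learned next-word distributions, and hence features built from them, to be within roughly $\sqrt{\epsilon}$ of the true ones.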
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

The paper proposes an investigation of how to finetune a model with multilingual data effectively, with theoretical and empirical evidence. They propose a method to avoid catastrophic forgetting by using an interior-point solver to optimize the objective function. Based on the theoretical work, the authors claim that the method is able to reach a Pareto-stationary point (no more changes occur when optimizing multiple objectives) by applying the optimization. They show the superiority of the approach on named entity recognition, question answering, and natural language inference, and in several languages.

Strengths: interesting work with a good theoretical foundation. The task is very useful for learning multilingual models while optimizing the model to reach closer to the upper bound.

Weaknesses: the paper is not easy to follow; the paper's method was unclear until I carefully read the paper several times. I would say the paper would be easier to follow if they put more details in the algorithm and the introduction. For example, what method does the paper use to optimize the weights (not just pointing to the equation)? And it would be great to add a short description of the method in the introduction, so that first-time readers can understand the proposed approach. The method is not well motivated with a clear hypothesis; therefore I cannot find the rationale for why LF-MLF is useful in the abstract and introduction. Typo in the conclusion: "learning learning". There is no specific limitation section provided in the paper.

docsep

This paper proposes a method for maintaining multilinguality under finetuning when a set of languages is used for finetuning. The core idea is to maintain cross-lingual generalization by finding updates that deviate not too much from the original pretrained model and its loss, and that many languages benefit from. The proposed loss and its optimization are theoretically motivated and defined, and then evaluated on downstream tasks of the XTREME benchmark with XLM-RoBERTa, where a few high-resource languages are chosen for finetuning and the remaining languages are used for zero-shot evaluation. (Note: presentation and overall score updated after the author rebuttal.)

Strengths: 1. The problem setting of zero-shot generalization after finetuning is novel and offers impactful applications. 2. The experimental results are largely convincing. 3. The detailed impact studies (section 5.3 and following) are insightful and thorough.

Weaknesses (note: all well addressed in the revised version): 1. The paper misrepresents multilingual finetuning: "multilingual finetuning, which has not been explored before" (l. 34); "the finetuning scenario where multiple source languages are involved in finetuning, namely multilingual finetuning, remains unexplored" (l. 59). It has been explored before, most prominently in the paper for the XTREME benchmark (https://arxiv.org/abs/2003.11080) that this very paper is also evaluating on: translate-train and in-language multitask from XTREME yielded gains over monolingual finetuning and should be discussed, in particular in relation to the uniform MLF baseline presented in this work. Another work that builds on multilingual finetuning is, for example, https://arxiv.org/abs/2205.02022 (NAACL 2022). What hasn't been studied in this setting, AFAIK, is the problem of forgetting for some languages; finetuning was either done on all languages of interest or only on English. 2. The abstract and introduction are not well representing the focus of the remaining
sections, as they do not introduce and motivate the forgetting problem that the proposed method is mainly developed for and the title is derived from. 3. A comparison to all-language finetuning should be included as an upper bound, since training data is available for at least the NER task. 4. Significance tests should be performed, since some differences are sometimes small and are averages across multiple languages, and might not be strong enough to conclude superiority of one model over the other; at the least, standard deviations could be reported as well, since there are 3 runs.

Limitations are not addressed, except for the fact that in some scenarios the proposed method does not outperform all other methods. Please discuss in which scenarios the method might not work, or where the assumption that being close to the original pretraining model/objective is not sufficient/desirable for downstream cross-lingual generalization.

docsep

This paper presents a way of learning from multilingual annotated data. They focus on zero-shot transfer to low-resource languages for tasks like NER, QA, and NLI. The authors claim that for this setting they need to optimize the model for two new objectives: one that minimizes the forgetting of the low-resource languages from pretraining, and one that ensures that, when finetuning on the different multilingual annotated datasets, the descent direction between them is shared or in common. They formulate this as a constrained optimization problem and call their method less-forgetting multilingual finetuning (LF-MLF). They show consistent but not very large improved performance on NER, QA, and NLI, and look at where labelled data from more languages helps (and, unsurprisingly, it does).

Strengths: they show consistent improvements averaged over up to 45 zero-shot language directions over three different tasks, which is quite extensive.

Weaknesses: it is not clear from the paper how their approach is different from the baseline methods; the authors should explain the most relevant baselines and describe how they are different (the project-conflicting-gradients method and gradient vaccine). This is a major flaw in the paper. They have limited the usefulness of the methods to a very narrow, specific use case (multilingual and zero-shot): the less-forgetting direction could in theory be useful for monolingual finetuning as well as multilingual finetuning, and the combined descent could be useful for transfer learning with labelled data too, not just zero-shot; why did the authors not mention this or do any experiments along these lines? It is probably incorrect to state (line 75) that MTL methods are not suitable for multilingual finetuning, as the authors use such methods as baselines in the paper, even though they do not describe them and how they are different from LF-MLF. The authors do not discuss the size of labelled data for each of the multilingual cases, or discuss how this affects the results (e.g., in table 2: en, de, fr, ru, etc.), as finetuning just on Russian gives better performance than just training on English, which does not make sense. The amount of improvement is not great: the simple uniform multilingual finetuning method is the second-best method, and LF-MLF is ahead of it by only 0.51 points. This work does not discuss its limitations, which is a limitation.
### Summary:
the paper proposes a method for finetuning multilingual pretrained language models in multiple languages simultaneously the task is formalized as a constrained optimization problem and an upper bound on the forgetting is given in theory a method is developed for multilingual finetuning to minimize this upper bound experiments are conducted on multiple downstream tasks where the model is finetuned on a few highresource languages and performance improves on lowresource languages in the zeroshot setting the authors responded to the reviewers concerns and the reviewers agree the responses addressed their concerns the paper is recommended to be accepted and i ask the authors to carefully prepare the final cameraready version based on the reviewers feedback
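The summary above describes the task as a constrained optimization problem whose aim is to minimize an upper bound on forgetting. Purely as an illustrative sketch (the reviews state that the paper uses an interior-point solver; the weight-distance penalty, the Lagrangian relaxation, and every name below are assumptions of this sketch, not the authors' implementation), a blended multilingual objective with a forgetting budget could look like:

```python
# Illustrative sketch only: the forgetting proxy, Lagrangian relaxation, and names
# are assumptions, not the paper's interior-point formulation.
import torch

def forgetting_proxy(model, pretrained_params):
    # squared distance to the pretrained weights, used here as a stand-in for forgetting
    return sum(((p - p0) ** 2).sum() for p, p0 in zip(model.parameters(), pretrained_params))

def constrained_multilingual_loss(model, language_batches, task_loss_fn,
                                  pretrained_params, multiplier, epsilon):
    # average task loss over the high-resource finetuning languages
    task_loss = torch.stack(
        [task_loss_fn(model(b["inputs"]), b["targets"]) for b in language_batches]
    ).mean()
    # Lagrangian relaxation of: minimize task_loss subject to forgetting_proxy <= epsilon
    violation = forgetting_proxy(model, pretrained_params) - epsilon
    return task_loss + multiplier * torch.clamp(violation, min=0.0)
```

The clamp keeps the penalty inactive while the forgetting proxy stays within its budget epsilon, which is one simple way a hard constraint can be approximated during training.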
input_ids / attention_mask / labels for this example: raw token-ID encodings of the text above (the attention_mask entries are all 1s); the full integer lists are omitted.
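The first set of reviews asks how the method differs from gradient-surgery baselines such as projecting conflicting gradients and gradient vaccine. For orientation only, a generic PCGrad-style combination of per-language gradients into a shared descent direction (this sketches the baseline family the reviews mention, not the paper's interior-point method) looks like:

```python
import torch

def shared_descent_direction(per_language_grads):
    # per_language_grads: list of flattened gradient vectors, one per finetuning language
    adjusted = []
    for i, g in enumerate(per_language_grads):
        g = g.clone()
        for j, other in enumerate(per_language_grads):
            if i == j:
                continue
            dot = torch.dot(g, other)
            if dot < 0:  # conflict: drop the component pointing against the other language
                g = g - dot / (other.norm() ** 2 + 1e-12) * other
        adjusted.append(g)
    # the average of the de-conflicted gradients serves as the common update direction
    return torch.stack(adjusted).mean(dim=0)
```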
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary the paper describes an algorithm to tackle model bias in the mpc they address the question of optimal horizon length as well the model errors observed in mpc based systems the paper is well motivated and written in clear concise manner experiments with cartpole and robot models demonstrate the practical feasibility of the proposed method strong points 1 good description of the sources of errors in mpc based models 2 the theorems are useful though i was not able to check their algebraic accuracy the limiting constructs are intuitively correct 3 choosing the horizon limit and tackling model errors are import challenges for mpc and the method proposed in this paper would be a good addition to the knowledge hence recommendation for accept to improve 1 i am not sure the mppi is the sota baseline for this comparison there are other mpc methods that achieve better results than mppi 2 although citations from the machine learning community seem to be covered the standard mpc literature seem to be completely ignored at the bare minimum when speak of stability and fast horizon planning no reference to tube based mpc see mayne 2011 3 major advantage of mpc is the ability to deal with constraints again see mayne et ref 2 below these are not recent developments but classic position papers that address lot of questions you pose and attempt to answer in the paper 4 the figures need a significant improvements in their current form the plots are too thin to read them correctly refs 1 mayne dq kerrigan ec van wyk ej and falugi p 2011 tubebased robust nonlinear model predictive control int j robust nonlinear control 21 13411353 doi101002rnc1758 2 dq mayne jb rawlings cv rao pom scokaert constrained model predictive control stability and optimality automaticavolume 36 issue 62000pages 789814 issn 00051098docsep blending mpc value function approximation for efficient reinforcement learning review summarization in this paper the authors consider using a blending of q value which is predicted directly from the neural network and a q value which is predicted by unrolling the learnt dynamics the empirical results suggest improved performance sample efficiency and good robustness in choosing different values of blending coefficient pros 1 the idea of blending the q values is novel i also think the connection to gae is quite natural and interesting where both algorithms consider trade off between bias and variance in this papers case bias in learned dynamics 2 the supporting experiments consider some of the interesting questions for example in section 51 the question of how sensitive lambda is is addressed 3 the direction of combining value estimation in modelpredictive control is interesting and underexplored that being said this paper can be inspiring and helpful towards future research cons 1 the experiment section lacks comparison among stateoftheart algorithms while mppi and ppo were generally considered stateoftheart at the time when they are published 2017 their performance is now outperformed heavily given the fast development in the research direction it would be great if some of the strong baselines between 20192020 are included sac td3 mbpo etc also given the similarity to mcts algorithms it would be more convincing to include one variant of it as a baseline 2 experiments on more environments are also appreciated questions i didnt see study on the training time and testing time how much 
time needed to generate one action during testing but it seems to be referred to in the introduction docsepsummary the paper provides and interesting analysis of and new method for modelpredictive control in reinforcement learning specifically it proposes a new framework mpqlambda for joining modelbased mpc with learned value estimates in rl the authors develop an formulation to find an optimal prediction horizon and and how this works in an online reinforcement learning framework the new approach is evaluated on 3 continuous control tasks and compared to some other baselines score reasoning this paper has interesting theoretical contributions to multiple areas of machine learning modelbased rl mpc and value estimation but the somewhat limited experimental evaluation make the efficacy of the method more difficult to judge overall the paper is well written and i enjoyed reading it i now will address my conceptual comments followed by more minor suggestions rebuttal update the authors have gone beyond the normal scope of a rebuttal phase to update their experiments and the motivation of the work and for that reason i have improved my recommendation to be above the acceptance threshold experimental validation questions i am breaking this section of the review into its own section because it is where the majority of my questions are e1 the authors bring up modelbased rl algorithms but do not baseline against other algorithms generally considered sample efficient ppo is not always easy to use authors mention it not converging and not substantial parameter tuning how about sac e1b it would be very interesting to compare something similar to the pets optimizer for mpc cited in intro but not really mentioned these baseline changes could make the results much more believable e2 all parameters were found using a coarse grid search this makes the results suspect to me please clarify how coarse is the same search space used for all algorithms were defaults used for algorithms with previously published results do the results match e3 was tuning of the reward functions done by hand in a21 or are they referenced elsewhere are states like xposition and pole angle normalized in cartpole this can have bigger effects in more complicated environments e4 shaded regions represent standard deviation for mppi is this over the same 30x3 evaluations very important to standard dev is the number of samples e5 cartpole swing up is a very similar task it seems to be the only one where mpqlambda substantially outperforms the baselines given no error bars on mpq too how do the ablation studies of figure 2 reproduce on more challenging environments comments 1 the authors refer to mpc as a simpler more practical alternative to rl or a more pragmatic approach for simple policies some would argue that rl is a simpler approach because it does not require any model in the case of modelfree rl mpc also has many design decisions such as which optimizer to use or the planning horizon multiple papers written on this topic i would like the authors to explain this with more detail or defend their stance 2 the authors may consider including these two other papers that relate to modelbased rl modelbias and mpc horizon httpsarxivorgpdf200909593pdf httpsarxivorgpdf200204523pdf 3 how does model bias differ from model inaccuracy in mbrl modelbias often refers to the model being more accurate in some areas of the statespace than others and how this impacts the downstream ranking of action choices do the authors consider this difference at 
all how does modelaccuracy drop when the bias terms are introduced in some basic metrics like meansquarederror or negativeloglikelihood metrics used in mbrl to quantify modelaccuracy 4 the position of the contribution in related works could be made stronger i was unaware that mpq was not the proposal of this paper until section 4 the difference between the two and why this matters should be in the introduction unless the authors decide to add a dedicated related works section 4b how does entropyregularized formulation impact the results from my reading that is an important part of the original mpq paper so i think it should be explained 5 the conclusion to this paper is weak it reiterates what is done but the authors should make a case how this impacts developments in robotics control to better match up with the experiments and introduction what should i take away from studying this paper 6 it would be interesting to see the authors propose how to combine the mpq framework with other forms of mpc that dont have an implicit terminal cost included this may be for future work but i would be interested in a comment minor comments 1 there are some typos that impede reading but overall the paper is well written intro paragraph 2 owing to its ability to is weird section 22 since it plans it is vague here some missing commas in first paragraph of section 3 first missing period at end of paragraph baselines missing period figure 3 end double period before conclusion typo in ppo a22 the 2 in 32 the authors show how to blend the modelbased and modelfree methods but point to a reference that is not obviously connected to me and call the approach common i would suggest adding more references or adjusting the claim 3 why was mppi chosen as the mpc algorithm it is a suitable choice but could be added 4 there is a lot of visuals in figures 2 and 3 maybe have fewer lines the font could be enlarged and it is very confusing that the yaxiss are not all the same for similar data types 4b figure 4b has strange shading from the mppi variance its not readable docsep summary the paper proposes to combine mpc and modelfree rl to overcome the possible modelling errors thereby the approach achieves the sampleefficiency of mpc and the control quality of modelfree rl the resulting mpqlambda algorithm uses mppi to obtain the actions by optimizing the blended mpc objective the qtargets for fitting the q function also use the blended qestimate quality originality significance the idea of combining mpc and modelfree rl is straight forward and not novel the paper also does not claim this however the exact instantiation is novel very well motivated and feels natural my biggest concerns are the experiments the cartpole experiments show the improved performance compared to mppi on the biased model and the impact of the lambda and model bias however the ppo baseline is missing for the cartpole right furthermore is ppo a fair comparison for the mpqlambda algorithm to evaluate sample complexity while ppo is a batched update the mpqlambda uses step based updates wouldnt a modelfree stepbased update algorithm such as ddpg sac etc be a better baseline to evaluate the improved sample complexity regarding the highdimensional tasks the provided evaluations do not enable an evaluation whether the task is solved or not could you please provide videos of the final policies otherwise the achieved reward is just a random number furthermore the paper shows confidence bounds for the mppi baselines but not for the mpqlambda algorithm also the 
learning curves are cut before converging could you train every instance until convergence furthermore could you please include the asymptotic performance of your baseline in the plots the definition of validation iteration remains unclear given the current evaluations the stated claim of applicability to highdim tasks cannot be made as the evaluations are not sufficient the used modelling bias is also very limited as the paper only compares to biases of the model parameters but not against other sources of biases ultimately the increased performance can only be shown on the physical system clarity style the paper is really well written and understandable a few sections could be improved eg text between eq 6 eq 8 in this section it is a bit unclear what is expanded and how it is expanded it would be beneficial to rethink the labeling of the qfunctions as they can be quite confusing maybe a table of the different subscriptsuperscript definitions would simplify the reading as i had to search for the exact definitions frequently furthermore there are minor styling issues inline equations are consuming too much space to mess up line spacing eg section 21 argmin theorem 31 norms the min in equation 13 needs two spaces and a subscript experiment o4 ends with two dots white space around figure 2 can be optimized conclusion all in all the paper is nicely written with a clear and well motivated idea of combining mpc modelfree rl right now the main problem is the execution of the evaluations the performance on the highdim tasks is unclear and the baseline is missing for the cartpole i would be happy to improve my score to weak accept if a step based model free rl algorithm is added to cartpole claims regarding highdim tasks are adapted and videos of the highdimensional tasks are released accept if the highdimensional tasks are working properly with mpqlambda strong accept if mpqlambda shows this performance on a physical system ps you might also try to get medium dimension tasks working such as hopper or cheetah that might be a bit easier post discussion comments so the author did a filibuster and flooded the discussions with bloated comments in this manner it was close to impossible to keep track of anything there has to be character limit for responses otherwise this is not feasible i looked at the videos and your physics simulators looks really catchy at one point in time the pole of the cartpole is at 1011 oclock and the cart starts moving right the pole has close to no velocity and hence only a very small angular momentum in this setting it would be natural that the pole would fall down if this state is maintained for a longer period which it is in the video however the pole goes upwards into the balancing position this is really weird and dont get me started on the penorientation as the pen sometimes floats midair for this setting the gravitational constant really does not seem right i also wouldnt consider the task solved as this is more an really uncoordinated movements for three specific configurations for the simulation studies some doubts remain but the authors improved the paper therefore i am going to increase my score to weak accept nevertheless the experimental evaluation could be improved and the paper would really benefit from real experiments ### Summary:
the authors put a lot of effort into replying to questions and improving the paper to a point that the reviewers felt overwhelmed pros an interesting way of dealing with model bias in mpc they successfully managed to address the most important concerns of the reviewers with lots of additional experiments and insights r3s concerns have also been successfully addressed by the authors the review scores were unfortunately not updated cons the only remaining point is that the simulations seem to be anything but physically realistic see the update at the end of r1s review which is probably a problem of the benchmarks and not the authors fault
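The reviews and summary above describe the approach as blending model-rollout returns from MPC with a learned value estimate, with a coefficient lambda that trades the two off in the spirit of TD(lambda)/GAE. A minimal sketch of such a lambda-weighted blend over a planning horizon H follows; the function name, arguments, and discounting details are assumptions for illustration, not the authors' MPQ(lambda) implementation:

```python
def blended_return(rewards, values, lam, gamma=0.99):
    """TD(lambda)-style blend of model-rollout returns with a learned value estimate.

    rewards: predicted rewards r_0..r_{H-1} from rolling out the learned dynamics model
    values:  learned value estimates V(s_1)..V(s_H) along the same rollout
    lam:     blending coefficient; small lam leans on the critic, large lam on the model
    """
    horizon = len(rewards)

    def n_step(n):
        # n-step model-based return bootstrapped with the learned value at step n
        return sum(gamma ** t * rewards[t] for t in range(n)) + gamma ** n * values[n - 1]

    # exponentially decaying weights over the 1..H-step returns, summing to one
    weights = [(1 - lam) * lam ** (n - 1) for n in range(1, horizon)]
    weights.append(lam ** (horizon - 1))
    return sum(w * n_step(n) for w, n in zip(weights, range(1, horizon + 1)))
```

Setting lam near 0 trusts the learned value function (robust to dynamics-model bias but dependent on a good critic), while lam near 1 trusts the model rollout, which is the bias/variance trade-off the reviews highlight.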
input_ids / attention_mask / labels for this example: raw token-ID encodings of the text above (the attention_mask entries are all 1s); the full integer lists are omitted.
7823, 5010, 253, 6786, 10921, 310, 816, 247, 3632, 1180, 33810, 253, 2929, 2722, 7162, 14493, 323, 253, 278, 377, 74, 1666, 25379, 533, 417, 323, 253, 23542, 82, 2260, 5933, 671, 253, 4715, 9191, 403, 2624, 1078, 5975, 3390, 812, 368, 6194, 1046, 4227, 1919, 14940, 33810, 812, 368, 4496, 2486, 253, 20185, 3045, 273, 634, 8245, 275, 253, 14777, 253, 5426, 273, 12820, 19502, 4558, 12744, 1677, 253, 1655, 27163, 253, 4767, 1750, 273, 30437, 281, 1029, 4528, 8892, 2550, 320, 1160, 347, 253, 27163, 403, 417, 4209, 253, 908, 26278, 8492, 310, 671, 1077, 3710, 347, 253, 2929, 760, 26662, 281, 31306, 273, 253, 1566, 3602, 533, 417, 1411, 643, 4973, 273, 31306, 9142, 253, 2559, 3045, 476, 760, 320, 2011, 327, 253, 3520, 985, 50275, 498, 15752, 50276, 4826, 253, 2929, 310, 1663, 973, 3542, 285, 34007, 247, 1643, 7118, 812, 320, 5520, 24088, 2505, 875, 16186, 721, 50276, 2574, 854, 275, 436, 2593, 352, 310, 247, 2372, 12744, 752, 310, 11848, 285, 849, 352, 310, 11848, 352, 651, 320, 12912, 281, 294, 18959, 253, 21473, 273, 253, 2805, 20619, 347, 597, 476, 320, 3240, 21643, 5046, 247, 2829, 273, 253, 1027, 749, 3866, 8403, 398, 1687, 14308, 651, 25636, 253, 4361, 347, 891, 574, 281, 3186, 323, 253, 3242, 14308, 7208, 33810, 627, 403, 5884, 43753, 3374, 50276, 17243, 7424, 403, 21337, 1512, 1199, 2317, 281, 4840, 598, 1386, 22735, 24088, 2593, 3127, 1736, 1222, 10012, 4562, 22429, 50276, 783, 1054, 275, 5150, 2145, 3198, 767, 8470, 285, 247, 749, 3866, 50276, 16217, 2092, 258, 21, 7637, 342, 767, 20200, 50276, 11300, 2317, 1475, 4677, 374, 476, 320, 18325, 50275, 585, 3444, 512, 275, 512, 253, 2929, 310, 23395, 3542, 342, 247, 2590, 285, 973, 17194, 2934, 273, 16248, 278, 5902, 50276, 2307, 813, 658, 391, 77, 987, 1024, 253, 2022, 1895, 310, 253, 10636, 273, 253, 27163, 253, 3045, 327, 253, 1029, 4528, 8892, 310, 12744, 285, 253, 8245, 310, 5816, 323, 253, 7281, 36479, 891, 651, 320, 5211, 281, 3157, 619, 4868, 281, 50276, 20881, 2997, 604, 247, 3213, 1754, 1566, 1959, 391, 77, 5933, 310, 2879, 281, 7281, 36479, 3916, 5001, 1029, 4528, 8892, 403, 12956, 285, 10556, 273, 253, 1029, 6967, 8892, 403, 4439, 50276, 14764, 604, 253, 1029, 6967, 8892, 403, 2444, 6283, 342, 23542, 82, 2260, 50276, 9072, 2997, 604, 23542, 82, 2260, 2722, 436, 3045, 327, 247, 3520, 985, 50275, 793, 368, 1537, 671, 1611, 281, 755, 4646, 7877, 8892, 2444, 824, 347, 8511, 3803, 390, 1161, 292, 1240, 326, 1537, 320, 247, 2372, 6927, 50274, 5996, 5955, 5701, 594, 253, 2488, 858, 247, 1193, 487, 8976, 285, 33913, 253, 11985, 342, 31767, 456, 5701, 275, 436, 5133, 352, 369, 2810, 281, 7479, 281, 1978, 3540, 273, 2712, 627, 556, 281, 320, 1894, 2701, 323, 6128, 5010, 436, 310, 417, 17887, 891, 3261, 387, 253, 10556, 285, 634, 12057, 948, 28457, 4453, 1663, 5834, 90, 387, 581, 1127, 275, 673, 253, 15903, 273, 253, 7281, 36479, 310, 387, 8437, 18, 258, 13273, 285, 253, 7281, 7866, 4886, 987, 253, 15903, 556, 2810, 281, 642, 7602, 285, 7613, 760, 247, 1077, 1355, 12336, 10254, 275, 436, 4758, 352, 651, 320, 3626, 326, 253, 15903, 651, 2965, 1066, 604, 436, 1375, 310, 8838, 323, 247, 3356, 2180, 534, 352, 310, 275, 253, 3492, 2299, 253, 15903, 4566, 32372, 715, 253, 26259, 1899, 436, 310, 1663, 12504, 285, 13414, 755, 479, 3053, 327, 253, 4331, 31756, 347, 253, 4331, 4536, 48158, 4260, 1094, 323, 436, 4758, 253, 18924, 3638, 1663, 1057, 417, 1646, 987, 891, 671, 651, 2649, 1908, 253, 4836, 14042, 347, 436, 310, 625, 271, 1663, 440, 29309, 3901, 11438, 323, 1264, 2173, 16012, 50275, 1542, 253, 9864, 2175, 690, 24626, 3464, 533, 253, 4477, 
5520, 253, 2929, 3103, 891, 717, 1469, 281, 2572, 619, 4868, 281, 5075, 2997, 17837, 253, 5661, 7103, 812, 320, 5520, 285, 253, 2929, 651, 1663, 5649, 432, 1524, 4679, 2490, 187, 4118, 18435, 27, 783, 4477, 1691, 247, 2257, 273, 3434, 275, 1234, 2943, 281, 3533, 285, 11138, 253, 2929, 281, 247, 1127, 326, 253, 30628, 3543, 29991, 50276, 856, 84, 50276, 266, 4722, 1039, 273, 10620, 342, 1566, 8492, 275, 278, 5902, 50276, 9328, 8379, 7303, 281, 2953, 253, 954, 1774, 7350, 273, 253, 30628, 342, 8783, 273, 3081, 4679, 285, 16039, 50276, 83, 20, 84, 7350, 452, 671, 644, 8379, 9713, 407, 253, 4477, 253, 2278, 50276, 18891, 497, 19235, 417, 9300, 50276, 5040, 50276, 783, 760, 5780, 1127, 310, 326, 253, 9938, 1646, 281, 320, 3253, 533, 13318, 15958, 5731, 387, 990, 273, 391, 18, 84, 2278, 534, 310, 3164, 247, 1895, 273, 253, 49602, 285, 417, 253, 4477, 35354 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: The paper studies the problem of boosting test performance of the last layer by crafting random perturbations that are orthogonal to the train feature matrix of the last layer (at least in the overparametrized case), thus leaving train performance unaffected.

Main comment: while the idea seemed interesting to me at first, I find the paper overselling the results. The claim that LLBoost improves performance is very strong; looking at the numbers, a 0.2 improvement for CIFAR-10 and a 0.08 improvement for ImageNet is marginal, to say the least. This is even starker when looking at test performance as opposed to validation, where for CIFAR the improvement is 0.03-0.1. The benefit for 2-class ImageNet32 is more significant in the low-sample regime, but it also fails to impress and feels more like cherry picking rather than a serious experimental ablation.

Minor comments: Figure 2 should be a table environment; by the way, it shows that standard normal perturbations of the last layer reduce accuracies significantly. I am wondering what would happen if we applied the same variance as LLBoost, as having different variances gives a false impression of whether random perturbations help or hurt.

docsep Summary: this paper provided an efficient algorithm, LLBoost, to boost the validation accuracy without spending too much time tuning hyperparameters. The algorithm is theoretically and empirically guaranteed. Reason for score: this paper provides an innovative way to improve generalization performance. My major concern is about the experiment part: since the algorithm uses validation data to tune the parameter, it should use another held-out test set to show the result; however, the authors only did the experiment on test data for the ResNet-18 model, which is not sufficient to support the paper. Pros: (1) This paper gave an efficient LLBoost algorithm to quickly improve the validation accuracy; the algorithm is theoretically guaranteed. (2) The paper clearly stated the intuition of the algorithm. The paper considered models that have an FC layer as the last layer (most current models have this property) and transformed the problem into a linear regression problem. (3) A surprising point of the algorithm is that it does not impact the training loss. Cons: (1) While the algorithm has a theoretical guarantee, the experiment part did not convince me; this is the major concern for the paper. The authors tune the parameter using validation data and say the validation accuracy is improved, which is not enough; another held-out test set should be used to show the result for all the experiments. However, the authors did an experiment on test data only for ResNet-18, which is not sufficient to support the paper. Also, the authors should put this test-data ResNet-18 experiment in the main part of the paper, not the appendix. (2) Section 3 (preliminaries and method) is not well organized; I cannot see why the authors put these two lemmas here. (a) Why does Lemma 1 imply that LLBoost does not affect training predictions, since it only ever adds a component orthogonal to the span of the training feature matrix? Lemma 1 seems to have nothing to do with the LLBoost algorithm. (b) What is the purpose of Lemma 2? The paper doesn't clearly state it. I understand the reason after the authors explained it in the response, but I strongly suggest the authors explain it in the paper for the final version. (3) Based on my understanding of this paper, the algorithm has to be applied to an existing pretrained model that is sufficiently good. If we don't have a good pretrained model, does this algorithm provide a better or comparable result than a well-tuned model? I am just curious about it and hope the authors will do some experiments in the future.

docsep Summary: this paper proposes LLBoost, which enables adjusting the last linear layer without impacting the training accuracy, under the assumption that the last linear layer is overparametrized. When the last layer is not overparametrized, LLBoost first applies a low-rank approximation to the training feature matrix through the SVD, which may affect the original training accuracy. The reason why LLBoost does not change the training accuracy is explained as follows: in overparametrized noiseless linear regression, the solution of a linear system $y = wX$ obtained by gradient descent from an initial value $w_0$ is given in the closed form $\hat{w} = w_0(I - X(X^\top X)^\dagger X^\top) + yX^\dagger$. Therefore, we can compute a solution of $y = wX$ by simply generating $w_0$ randomly and applying this formula. It is also experimentally verified that LLBoost can adjust the last linear layer without impacting the training accuracy (after approximating with the SVD when necessary). The authors also present theoretical results showing that sampling $w_0$ uniformly on a hypersphere of appropriate radius leads to a solution that is better than the minimum-norm solution $yX^\dagger$ with constant probability. Reasons for score: overall, I vote for weak reject. It is interesting that LLBoost can adjust the last layer without impacting the training accuracy, and the theoretical results give a reason to sample $w_0$ uniformly on a hypersphere in Alg. 1. However, the condition that the last layer is overparametrized is rarely satisfied in the practical problems to which DNNs are applied; as discussed in Sec. 4, the low-rank approximation can harm the accuracy in large problems like ImageNet. The authors show in Figures 1-3 that LLBoost can improve the validation accuracy without impacting the training accuracy. However, since Alg. 1 directly uses the validation labels (though they are denoted as test labels in Alg. 1) to select $w_{\mathrm{best}}$, it should be compared in terms of the held-out test accuracy to examine the usefulness of LLBoost. Pros: (1) The authors propose a method to adjust the last linear layer of a DNN without impacting the training accuracy, under the assumption that the last layer is overparametrized. (2) The authors give theoretical results that sampling $w_0$ on a hypersphere leads to a good solution with constant probability. Cons: (1) In practical problems to which DNNs are applied, the overparametrized assumption rarely holds, and the low-rank approximation with the SVD may worsen the accuracy. (2) Because Alg. 1 uses validation labels as its input to select the best solution, it is not enough to report the validation accuracy in experimental results like Figures 1-3; the accuracy on a held-out test set should also be reported. Other concerns: in line 3 of Alg. 1, why is $I_{d\times d} - U I_{r\times r} U^\top$ used, though it is explained as $\Pi = X(X^\top X)^\dagger X^\top$ in the method overview paragraph? Although the input of Alg. 1 includes the test feature matrix and test labels, they seem better denoted as the validation feature matrix and validation labels, respectively. In Figure 2, "train/val acc original" should be values without the low-rank approximation (e.g. 95193 for train acc on ImageNet, as denoted in Sec. 4); also, it is not clear whether the val acc is computed with or without the low-rank approximation. ### Summary:
Though the method suggested in this paper is interesting, theoretically motivated, and resulted in some practical improvement, the reviewers ultimately had low scores. The reasons for this are: (1) the improvements obtained by this method were rather small, especially on the standard datasets (CIFAR, ImageNet); (2) in the main results presented in the paper, it seems that a proper validation/test split was not done, which seems quite important for demonstrating the validity of this method; in some of the results presented in the supplementary such a split was done, but this seems to decrease the performance of the method even more; (3) the method requires that features in the last hidden layer approximately span a low-dimensional manifold, which seems like a major limitation for the accuracy of this method; it becomes approximate in datasets where the number of datapoints is larger than the size of the last hidden layer, which is the common case. Therefore, I suggest the authors try to improve all of the above issues and resubmit. For example, one simple way to address issue 3 (and potentially improve the results, issue 1) is to use the same method on all the features in all the layers instead of just the last layer; in other words, concatenate all the features from all the layers and then add a linear layer from this concatenated feature vector directly to the network output, in a direction that is orthogonal to the data.
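The procedure debated in the LLBoost reviews above can be made concrete with a small sketch: draw a random last-layer perturbation, project it onto the orthogonal complement of the training-feature span (so training predictions are unchanged in the overparametrized case), rescale it to a fixed radius, and keep the candidate that scores best on held-out data. This is an illustrative reconstruction under those assumptions, not the authors' released code; the function names, the classification-style accuracy metric, and the radius/sample-count defaults are placeholders.

```python
import numpy as np

def llboost_style_search(W, feats_train, X_val, y_val, radius=1.0, n_samples=100, seed=0):
    """Illustrative sketch: perturb the last linear layer only in directions
    orthogonal to the span of the training features, then pick the candidate
    with the best held-out accuracy. Not the authors' implementation."""
    rng = np.random.default_rng(seed)
    d = W.shape[1]                                   # last-layer input dimension
    # Orthonormal basis U of the training feature span (feats_train: n x d, n < d).
    U, _, _ = np.linalg.svd(feats_train.T, full_matrices=False)
    proj_orth = np.eye(d) - U @ U.T                  # projector onto the orthogonal complement

    def accuracy(Wc):
        preds = (X_val @ Wc.T).argmax(axis=1)
        return (preds == y_val).mean()

    best_W, best_acc = W, accuracy(W)
    for _ in range(n_samples):
        delta = rng.normal(size=W.shape)
        delta = delta @ proj_orth                    # leaves every training output unchanged
        delta *= radius / np.linalg.norm(delta)      # fixed-radius perturbation
        cand = W + delta
        acc = accuracy(cand)
        if acc > best_acc:
            best_W, best_acc = cand, acc
    return best_W, best_acc
```

As the meta-review notes, selecting the perturbation on the same split that is later reported inflates the gain, so a separate held-out test set would still be needed to claim a real improvement.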
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 2175, 253, 1895, 273, 43124, 1071, 3045, 273, 253, 1390, 3828, 407, 49378, 3632, 26309, 326, 403, 19627, 281, 253, 6194, 4735, 4315, 273, 253, 1390, 3828, 387, 1878, 275, 253, 689, 3575, 292, 50065, 1083, 3021, 6108, 6194, 3045, 30195, 50273, 7265, 4385, 1223, 253, 2934, 4455, 4722, 281, 479, 387, 806, 891, 1089, 253, 2929, 689, 23708, 253, 1543, 253, 1750, 326, 26198, 15467, 19132, 3045, 310, 1077, 2266, 672, 2819, 387, 253, 3904, 16261, 7756, 323, 260, 338, 274, 740, 285, 7449, 25, 7756, 323, 4440, 257, 292, 310, 16888, 281, 1333, 253, 1878, 436, 310, 1014, 331, 26599, 672, 2819, 387, 1071, 3045, 347, 10066, 281, 12820, 835, 323, 260, 338, 274, 253, 7756, 310, 209, 4838, 520, 253, 5649, 323, 374, 966, 4440, 257, 292, 1237, 310, 625, 1534, 387, 253, 1698, 3410, 9459, 533, 671, 10224, 281, 21097, 285, 9193, 625, 751, 33804, 8871, 2581, 685, 247, 4092, 5661, 28913, 50275, 37585, 5701, 4677, 374, 943, 320, 247, 2829, 3126, 270, 7553, 2722, 326, 2629, 2622, 26309, 273, 253, 1390, 3828, 4796, 3933, 19103, 3012, 516, 12371, 752, 651, 5108, 604, 359, 4647, 253, 1072, 11041, 347, 253, 26198, 15467, 347, 1907, 1027, 48894, 4245, 50276, 66, 3221, 13214, 273, 1880, 3632, 26309, 1361, 390, 8513, 5474, 33032, 50276, 8774, 50276, 2520, 2929, 2530, 271, 5919, 5933, 26198, 15467, 281, 9510, 253, 12820, 7200, 1293, 9100, 1512, 1199, 673, 25184, 4373, 19484, 253, 5933, 310, 28055, 285, 45190, 16293, 50274, 10752, 323, 4868, 50276, 2520, 2929, 3400, 271, 16694, 1039, 281, 3157, 26647, 3045, 619, 2201, 4468, 310, 670, 3368, 629, 1580, 253, 5933, 897, 3588, 941, 281, 19928, 253, 4764, 352, 943, 1529, 2918, 483, 1071, 941, 281, 921, 253, 906, 2299, 253, 2488, 760, 858, 3368, 327, 1071, 2203, 323, 1566, 501, 3024, 1093, 534, 310, 417, 4209, 281, 1329, 253, 2929, 50274, 856, 84, 50276, 18, 436, 2929, 3534, 271, 5919, 26198, 15467, 5933, 281, 4541, 3157, 253, 12820, 7200, 253, 5933, 310, 28055, 16293, 50276, 19, 253, 2929, 4518, 4767, 253, 30328, 273, 253, 5933, 253, 2929, 2783, 3210, 326, 452, 269, 68, 3828, 347, 253, 1390, 3828, 954, 273, 253, 1655, 3210, 452, 436, 2867, 285, 13657, 253, 1895, 715, 247, 4872, 9077, 1895, 50276, 20, 247, 10084, 1127, 273, 253, 5933, 310, 326, 352, 1057, 417, 3486, 253, 3733, 2957, 50274, 5040, 337, 1223, 253, 5933, 556, 247, 28055, 12215, 253, 3368, 629, 858, 417, 18578, 479, 436, 310, 253, 2201, 4468, 323, 253, 2929, 253, 2488, 19928, 253, 4764, 970, 3588, 941, 285, 1333, 3588, 7200, 310, 5520, 534, 310, 417, 2217, 352, 943, 1529, 2918, 483, 1071, 941, 281, 921, 253, 906, 323, 512, 253, 3368, 2299, 253, 2488, 858, 271, 3368, 327, 1071, 2203, 760, 323, 501, 3024, 1093, 534, 310, 417, 4209, 281, 1329, 253, 2929, 671, 253, 2488, 943, 1691, 436, 1071, 2203, 373, 3024, 1093, 3368, 275, 2022, 629, 273, 253, 2929, 417, 30762, 50276, 19, 2593, 495, 11944, 249, 3927, 285, 1332, 310, 417, 973, 34092, 891, 2550, 923, 2139, 253, 2488, 1691, 436, 767, 458, 44661, 1060, 50276, 18, 2139, 18057, 337, 8018, 326, 26198, 15467, 1057, 417, 2818, 3733, 13650, 1580, 352, 760, 2455, 11323, 247, 4445, 19627, 281, 253, 13905, 273, 253, 3733, 4735, 4315, 253, 18057, 337, 3133, 1907, 2717, 281, 513, 342, 253, 26198, 15467, 5933, 50276, 19, 752, 310, 253, 4096, 323, 18057, 374, 253, 2929, 36908, 4518, 1375, 352, 50276, 74, 2096, 253, 1921, 846, 253, 2488, 5544, 352, 275, 2380, 533, 891, 7052, 1804, 253, 2488, 281, 5513, 352, 275, 2929, 323, 253, 2457, 
2715, 50276, 20, 1754, 327, 619, 4685, 273, 436, 2929, 253, 5933, 556, 281, 320, 3732, 281, 271, 5368, 3215, 11273, 1566, 534, 310, 4209, 1175, 604, 359, 13414, 452, 247, 1175, 3215, 11273, 1566, 1057, 436, 5933, 2085, 247, 1805, 390, 10870, 906, 685, 253, 973, 85, 37437, 1566, 891, 717, 816, 14338, 670, 352, 285, 3524, 253, 2488, 281, 513, 690, 4679, 275, 253, 2852, 50274, 7152, 33032, 50276, 8774, 50276, 2520, 2929, 29328, 26198, 15467, 326, 13276, 19427, 253, 1390, 4872, 3828, 1293, 48482, 253, 3733, 7200, 762, 253, 9376, 326, 253, 1390, 4872, 3828, 310, 275, 271, 689, 3575, 292, 50065, 4112, 672, 253, 1390, 3828, 310, 417, 689, 3575, 292, 50065, 26198, 15467, 806, 10384, 253, 1698, 5958, 11193, 281, 253, 3733, 4735, 4315, 949, 253, 18504, 69, 14717, 534, 778, 2818, 253, 3236, 3733, 7200, 253, 1921, 2139, 26198, 15467, 1057, 417, 1818, 253, 3733, 7200, 310, 5544, 347, 3637, 275, 271, 689, 3575, 292, 50065, 642, 261, 6134, 4872, 9077, 247, 2900, 273, 247, 4872, 985, 340, 50276, 22358, 2797, 407, 253, 11786, 18499, 342, 271, 3302, 1318, 273, 259, 17, 310, 1677, 275, 247, 4581, 830, 273, 7856, 88, 50276, 88, 17, 891, 50276, 89, 633, 412, 1269, 11560, 209, 633, 412, 50275, 28264, 11560, 3103, 359, 476, 11897, 247, 2900, 273, 340, 50276, 22358, 407, 3365, 11365, 259, 17, 12421, 285, 9433, 436, 7212, 352, 310, 671, 21657, 16058, 326, 26198, 15467, 476, 4575, 253, 1390, 4872, 3828, 1293, 48482, 253, 3733, 7200, 846, 622, 2633, 18280, 342, 18504, 69, 672, 3309, 253, 4477, 671, 1246, 10527, 1543, 326, 10491, 259, 17, 17568, 327, 253, 24052, 73, 44929, 273, 4569, 9941, 5644, 281, 247, 2900, 326, 310, 1805, 685, 253, 5927, 5222, 2900, 340, 89, 11560, 342, 3638, 5912, 50274, 250, 3743, 323, 4868, 50275, 1189, 455, 891, 6273, 323, 5075, 12009, 352, 310, 4722, 326, 26198, 15467, 476, 4575, 253, 1390, 3828, 1293, 48482, 253, 3733, 7200, 285, 253, 10527, 1543, 1918, 247, 1921, 281, 3410, 259, 17, 17568, 327, 247, 24052, 81, 1568, 275, 20320, 337, 2299, 253, 1617, 326, 253, 1390, 3828, 310, 689, 3575, 292, 50065, 310, 11766, 10048, 275, 8542, 3237, 281, 534, 277, 79, 2224, 403, 3732, 347, 5469, 275, 4706, 577, 253, 1698, 5958, 11193, 476, 5237, 253, 7200, 275, 1781, 3237, 751, 4440, 257, 292, 253, 4477, 921, 275, 4677, 15567, 50276, 3529, 26198, 15467, 476, 3157, 253, 12820, 7200, 1293, 48482, 253, 3733, 7200, 2299, 1580, 20320, 337, 3587, 4648, 253, 12820, 13301, 2167, 352, 310, 17007, 407, 1071, 13301, 275, 20320, 337, 281, 3609, 259, 14461, 352, 943, 320, 2429, 275, 2426, 273, 253, 2186, 483, 1071, 7200, 281, 9186, 253, 31471, 273, 26198, 15467, 50272, 856, 84, 50275, 18, 253, 4477, 12661, 247, 1332, 281, 4575, 253, 1390, 4872, 3828, 273, 247, 277, 9866, 1293, 48482, 253, 3733, 7200, 762, 253, 9376, 326, 253, 1390, 3828, 310, 689, 3575, 292, 50065, 50276, 19, 253, 4477, 1918, 10527, 1543, 326, 10491, 259, 17, 327, 247, 24052, 81, 1568, 5644, 281, 247, 1175, 2900, 342, 3638, 5912, 50272, 5040, 50275, 18, 275, 8542, 3237, 281, 534, 277, 79, 2224, 403, 3732, 253, 689, 3575, 292, 50065, 9376, 11766, 6556, 285, 253, 1698, 5958, 11193, 342, 18504, 69, 778, 548, 8243, 253, 7200, 50276, 19, 984, 20320, 337, 4648, 12820, 13301, 347, 697, 3280, 281, 3609, 253, 1682, 2900, 352, 310, 417, 2217, 281, 1304, 253, 12820, 7200, 275, 5661, 1543, 751, 4677, 15567, 253, 7200, 273, 247, 2186, 483, 1071, 873, 943, 671, 320, 2361, 50273, 977, 7350, 50275, 249, 1386, 495, 273, 20320, 337, 2139, 33786, 2069, 277, 50276, 86, 3496, 2069, 391, 2780, 412, 310, 908, 2167, 352, 310, 7148, 264, 347, 12580, 50275, 89, 633, 412, 
1269, 11560, 209, 633, 412, 275, 1332, 18389, 12494, 50275, 20261, 253, 3280, 273, 20320, 337, 3797, 1071, 4735, 4315, 285, 1071, 13301, 597, 1646, 1805, 17007, 407, 12820, 4735, 4315, 285, 12820, 13301, 2975, 50275, 249, 4677, 374, 6194, 1208, 756, 3236, 943, 320, 2193, 1293, 1698, 5958, 622, 2633, 303, 569, 24088, 5325, 19631, 323, 6194, 756, 275, 4440, 257, 292, 347, 17007, 275, 4706, 577, 671, 352, 310, 417, 2590, 1880, 253, 821, 756, 310, 10302, 342, 390, 1293, 1698, 5958, 622, 2633, 303, 569, 50276, 187, 187, 4118, 18435, 27, 2004, 253, 1332, 5125, 275, 436, 2929, 310, 4722, 28055, 17194, 285, 7369, 275, 690, 8542, 7756, 253, 30628, 9142, 574, 1698, 7363, 253, 4606, 323, 436, 403, 337, 253, 11701, 2797, 407, 436, 1332, 497, 2581, 1355, 3340, 327, 253, 2629, 15302, 260, 338, 274, 4440, 257, 292, 374, 275, 253, 2022, 1543, 3559, 275, 253, 2929, 352, 3133, 326, 247, 1463, 12820, 2566, 8085, 369, 417, 2218, 534, 3133, 3240, 1774, 323, 17227, 253, 13091, 273, 436, 1332, 275, 690, 273, 253, 1543, 3559, 275, 24864, 824, 247, 8085, 369, 2218, 533, 436, 3133, 281, 6379, 253, 3045, 273, 253, 1332, 1014, 625, 495, 253, 1332, 4419, 326, 3386, 275, 253, 1390, 8763, 3828, 5512, 13905, 247, 1698, 15759, 16751, 436, 3133, 751, 247, 2201, 12291, 323, 253, 7200, 273, 436, 1332, 534, 4916, 16851, 275, 15302, 835, 253, 1180, 273, 2856, 522, 842, 84, 310, 4067, 685, 253, 1979, 273, 253, 1390, 8763, 3828, 534, 310, 253, 1846, 1083, 50276, 45230, 891, 1804, 253, 4477, 1611, 281, 3157, 512, 273, 253, 1840, 3374, 285, 501, 538, 2225, 323, 1650, 581, 2969, 1039, 281, 2953, 2523, 495, 285, 7826, 3157, 253, 1543, 2523, 337, 310, 281, 897, 253, 1072, 1332, 327, 512, 253, 3386, 275, 512, 253, 8090, 3185, 273, 816, 253, 1390, 3828, 275, 643, 3000, 32147, 366, 512, 253, 3386, 285, 512, 253, 8090, 285, 840, 823, 247, 4872, 3828, 432, 436, 32147, 456, 4735, 4972, 3587, 281, 253, 2990, 3453, 275, 247, 3884, 326, 310, 19627, 281, 253, 941, 50275 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 2175, 253, 1895, 273, 43124, 1071, 3045, 273, 253, 1390, 3828, 407, 49378, 3632, 26309, 326, 403, 19627, 281, 253, 6194, 4735, 4315, 273, 253, 1390, 3828, 387, 1878, 275, 253, 689, 3575, 292, 50065, 1083, 3021, 6108, 6194, 3045, 30195, 50273, 7265, 4385, 1223, 253, 2934, 4455, 4722, 281, 479, 387, 806, 891, 1089, 253, 2929, 689, 23708, 253, 1543, 253, 1750, 326, 26198, 15467, 19132, 3045, 310, 1077, 2266, 672, 2819, 387, 253, 3904, 16261, 7756, 323, 260, 338, 274, 740, 285, 7449, 25, 7756, 323, 4440, 257, 292, 310, 16888, 281, 1333, 253, 1878, 436, 310, 1014, 331, 26599, 672, 2819, 387, 1071, 3045, 347, 10066, 281, 12820, 835, 323, 260, 338, 274, 253, 7756, 310, 209, 4838, 520, 253, 5649, 323, 374, 966, 4440, 257, 292, 1237, 310, 625, 1534, 387, 253, 1698, 3410, 9459, 533, 671, 10224, 281, 21097, 285, 9193, 625, 751, 33804, 8871, 2581, 685, 247, 4092, 5661, 28913, 50275, 37585, 5701, 4677, 374, 943, 320, 247, 2829, 3126, 270, 7553, 2722, 326, 2629, 2622, 26309, 273, 253, 1390, 3828, 4796, 3933, 19103, 3012, 516, 12371, 752, 651, 5108, 604, 359, 4647, 253, 1072, 11041, 347, 253, 26198, 15467, 347, 1907, 1027, 48894, 4245, 50276, 66, 3221, 13214, 273, 1880, 3632, 26309, 1361, 390, 8513, 5474, 33032, 50276, 8774, 50276, 2520, 2929, 2530, 271, 5919, 5933, 26198, 15467, 281, 9510, 253, 12820, 7200, 1293, 9100, 1512, 1199, 673, 25184, 4373, 19484, 253, 5933, 310, 28055, 285, 45190, 16293, 50274, 10752, 323, 4868, 50276, 2520, 2929, 3400, 271, 16694, 1039, 281, 3157, 26647, 3045, 619, 2201, 4468, 310, 670, 3368, 629, 1580, 253, 5933, 897, 3588, 941, 281, 19928, 253, 4764, 352, 943, 1529, 2918, 483, 1071, 941, 281, 921, 253, 906, 2299, 253, 2488, 760, 858, 3368, 327, 1071, 2203, 323, 1566, 501, 3024, 1093, 534, 310, 417, 4209, 281, 1329, 253, 2929, 50274, 856, 84, 50276, 18, 436, 2929, 3534, 271, 5919, 26198, 15467, 5933, 281, 4541, 3157, 253, 12820, 7200, 253, 5933, 310, 28055, 16293, 50276, 19, 253, 2929, 4518, 4767, 253, 30328, 273, 253, 5933, 253, 2929, 2783, 3210, 326, 452, 269, 68, 3828, 347, 253, 1390, 3828, 954, 273, 253, 1655, 3210, 452, 436, 2867, 285, 13657, 253, 1895, 715, 247, 4872, 9077, 1895, 50276, 20, 247, 10084, 1127, 273, 253, 5933, 310, 326, 352, 1057, 417, 3486, 253, 3733, 2957, 50274, 5040, 337, 1223, 253, 5933, 556, 247, 28055, 12215, 253, 3368, 629, 858, 417, 18578, 479, 436, 310, 253, 2201, 4468, 323, 253, 2929, 253, 2488, 19928, 253, 4764, 970, 3588, 941, 285, 1333, 3588, 7200, 310, 5520, 534, 310, 417, 2217, 352, 943, 1529, 2918, 483, 1071, 941, 281, 921, 253, 906, 323, 512, 253, 3368, 2299, 253, 2488, 858, 271, 3368, 327, 1071, 2203, 760, 323, 501, 3024, 1093, 534, 310, 417, 4209, 281, 1329, 253, 2929, 671, 253, 2488, 943, 1691, 436, 1071, 2203, 373, 3024, 1093, 3368, 275, 2022, 629, 273, 253, 2929, 417, 30762, 50276, 19, 2593, 495, 11944, 249, 3927, 285, 1332, 310, 417, 973, 34092, 891, 2550, 923, 2139, 253, 2488, 1691, 436, 767, 458, 44661, 1060, 50276, 18, 2139, 18057, 337, 8018, 326, 26198, 15467, 1057, 417, 2818, 3733, 13650, 1580, 352, 760, 2455, 11323, 247, 4445, 19627, 281, 253, 13905, 273, 253, 3733, 4735, 4315, 253, 18057, 337, 3133, 1907, 2717, 281, 513, 342, 253, 26198, 15467, 5933, 50276, 19, 752, 310, 253, 4096, 323, 18057, 374, 253, 2929, 36908, 4518, 1375, 352, 50276, 74, 2096, 253, 1921, 846, 253, 2488, 5544, 352, 275, 2380, 533, 891, 7052, 1804, 253, 2488, 281, 5513, 352, 275, 2929, 323, 253, 2457, 
2715, 50276, 20, 1754, 327, 619, 4685, 273, 436, 2929, 253, 5933, 556, 281, 320, 3732, 281, 271, 5368, 3215, 11273, 1566, 534, 310, 4209, 1175, 604, 359, 13414, 452, 247, 1175, 3215, 11273, 1566, 1057, 436, 5933, 2085, 247, 1805, 390, 10870, 906, 685, 253, 973, 85, 37437, 1566, 891, 717, 816, 14338, 670, 352, 285, 3524, 253, 2488, 281, 513, 690, 4679, 275, 253, 2852, 50274, 7152, 33032, 50276, 8774, 50276, 2520, 2929, 29328, 26198, 15467, 326, 13276, 19427, 253, 1390, 4872, 3828, 1293, 48482, 253, 3733, 7200, 762, 253, 9376, 326, 253, 1390, 4872, 3828, 310, 275, 271, 689, 3575, 292, 50065, 4112, 672, 253, 1390, 3828, 310, 417, 689, 3575, 292, 50065, 26198, 15467, 806, 10384, 253, 1698, 5958, 11193, 281, 253, 3733, 4735, 4315, 949, 253, 18504, 69, 14717, 534, 778, 2818, 253, 3236, 3733, 7200, 253, 1921, 2139, 26198, 15467, 1057, 417, 1818, 253, 3733, 7200, 310, 5544, 347, 3637, 275, 271, 689, 3575, 292, 50065, 642, 261, 6134, 4872, 9077, 247, 2900, 273, 247, 4872, 985, 340, 50276, 22358, 2797, 407, 253, 11786, 18499, 342, 271, 3302, 1318, 273, 259, 17, 310, 1677, 275, 247, 4581, 830, 273, 7856, 88, 50276, 88, 17, 891, 50276, 89, 633, 412, 1269, 11560, 209, 633, 412, 50275, 28264, 11560, 3103, 359, 476, 11897, 247, 2900, 273, 340, 50276, 22358, 407, 3365, 11365, 259, 17, 12421, 285, 9433, 436, 7212, 352, 310, 671, 21657, 16058, 326, 26198, 15467, 476, 4575, 253, 1390, 4872, 3828, 1293, 48482, 253, 3733, 7200, 846, 622, 2633, 18280, 342, 18504, 69, 672, 3309, 253, 4477, 671, 1246, 10527, 1543, 326, 10491, 259, 17, 17568, 327, 253, 24052, 73, 44929, 273, 4569, 9941, 5644, 281, 247, 2900, 326, 310, 1805, 685, 253, 5927, 5222, 2900, 340, 89, 11560, 342, 3638, 5912, 50274, 250, 3743, 323, 4868, 50275, 1189, 455, 891, 6273, 323, 5075, 12009, 352, 310, 4722, 326, 26198, 15467, 476, 4575, 253, 1390, 3828, 1293, 48482, 253, 3733, 7200, 285, 253, 10527, 1543, 1918, 247, 1921, 281, 3410, 259, 17, 17568, 327, 247, 24052, 81, 1568, 275, 20320, 337, 2299, 253, 1617, 326, 253, 1390, 3828, 310, 689, 3575, 292, 50065, 310, 11766, 10048, 275, 8542, 3237, 281, 534, 277, 79, 2224, 403, 3732, 347, 5469, 275, 4706, 577, 253, 1698, 5958, 11193, 476, 5237, 253, 7200, 275, 1781, 3237, 751, 4440, 257, 292, 253, 4477, 921, 275, 4677, 15567, 50276, 3529, 26198, 15467, 476, 3157, 253, 12820, 7200, 1293, 48482, 253, 3733, 7200, 2299, 1580, 20320, 337, 3587, 4648, 253, 12820, 13301, 2167, 352, 310, 17007, 407, 1071, 13301, 275, 20320, 337, 281, 3609, 259, 14461, 352, 943, 320, 2429, 275, 2426, 273, 253, 2186, 483, 1071, 7200, 281, 9186, 253, 31471, 273, 26198, 15467, 50272, 856, 84, 50275, 18, 253, 4477, 12661, 247, 1332, 281, 4575, 253, 1390, 4872, 3828, 273, 247, 277, 9866, 1293, 48482, 253, 3733, 7200, 762, 253, 9376, 326, 253, 1390, 3828, 310, 689, 3575, 292, 50065, 50276, 19, 253, 4477, 1918, 10527, 1543, 326, 10491, 259, 17, 327, 247, 24052, 81, 1568, 5644, 281, 247, 1175, 2900, 342, 3638, 5912, 50272, 5040, 50275, 18, 275, 8542, 3237, 281, 534, 277, 79, 2224, 403, 3732, 253, 689, 3575, 292, 50065, 9376, 11766, 6556, 285, 253, 1698, 5958, 11193, 342, 18504, 69, 778, 548, 8243, 253, 7200, 50276, 19, 984, 20320, 337, 4648, 12820, 13301, 347, 697, 3280, 281, 3609, 253, 1682, 2900, 352, 310, 417, 2217, 281, 1304, 253, 12820, 7200, 275, 5661, 1543, 751, 4677, 15567, 253, 7200, 273, 247, 2186, 483, 1071, 873, 943, 671, 320, 2361, 50273, 977, 7350, 50275, 249, 1386, 495, 273, 20320, 337, 2139, 33786, 2069, 277, 50276, 86, 3496, 2069, 391, 2780, 412, 310, 908, 2167, 352, 310, 7148, 264, 347, 12580, 50275, 89, 633, 412, 
1269, 11560, 209, 633, 412, 275, 1332, 18389, 12494, 50275, 20261, 253, 3280, 273, 20320, 337, 3797, 1071, 4735, 4315, 285, 1071, 13301, 597, 1646, 1805, 17007, 407, 12820, 4735, 4315, 285, 12820, 13301, 2975, 50275, 249, 4677, 374, 6194, 1208, 756, 3236, 943, 320, 2193, 1293, 1698, 5958, 622, 2633, 303, 569, 24088, 5325, 19631, 323, 6194, 756, 275, 4440, 257, 292, 347, 17007, 275, 4706, 577, 671, 352, 310, 417, 2590, 1880, 253, 821, 756, 310, 10302, 342, 390, 1293, 1698, 5958, 622, 2633, 303, 569, 50276, 187, 187, 4118, 18435, 27, 2004, 253, 1332, 5125, 275, 436, 2929, 310, 4722, 28055, 17194, 285, 7369, 275, 690, 8542, 7756, 253, 30628, 9142, 574, 1698, 7363, 253, 4606, 323, 436, 403, 337, 253, 11701, 2797, 407, 436, 1332, 497, 2581, 1355, 3340, 327, 253, 2629, 15302, 260, 338, 274, 4440, 257, 292, 374, 275, 253, 2022, 1543, 3559, 275, 253, 2929, 352, 3133, 326, 247, 1463, 12820, 2566, 8085, 369, 417, 2218, 534, 3133, 3240, 1774, 323, 17227, 253, 13091, 273, 436, 1332, 275, 690, 273, 253, 1543, 3559, 275, 24864, 824, 247, 8085, 369, 2218, 533, 436, 3133, 281, 6379, 253, 3045, 273, 253, 1332, 1014, 625, 495, 253, 1332, 4419, 326, 3386, 275, 253, 1390, 8763, 3828, 5512, 13905, 247, 1698, 15759, 16751, 436, 3133, 751, 247, 2201, 12291, 323, 253, 7200, 273, 436, 1332, 534, 4916, 16851, 275, 15302, 835, 253, 1180, 273, 2856, 522, 842, 84, 310, 4067, 685, 253, 1979, 273, 253, 1390, 8763, 3828, 534, 310, 253, 1846, 1083, 50276, 45230, 891, 1804, 253, 4477, 1611, 281, 3157, 512, 273, 253, 1840, 3374, 285, 501, 538, 2225, 323, 1650, 581, 2969, 1039, 281, 2953, 2523, 495, 285, 7826, 3157, 253, 1543, 2523, 337, 310, 281, 897, 253, 1072, 1332, 327, 512, 253, 3386, 275, 512, 253, 8090, 3185, 273, 816, 253, 1390, 3828, 275, 643, 3000, 32147, 366, 512, 253, 3386, 285, 512, 253, 8090, 285, 840, 823, 247, 4872, 3828, 432, 436, 32147, 456, 4735, 4972, 3587, 281, 253, 2990, 3453, 275, 247, 3884, 326, 310, 19627, 281, 253, 941, 50275 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: The paper proposes an attractive smoothing algorithm via log-supermodular CRFs, which takes the CRF as a regularizer for the DNNs. Extensive experiments on a range of applications demonstrate that even conventional models outperform more recent ones with the addition of an attractive CRF. The paper is written clearly; both the quantitative and qualitative analysis experiments are well done, which can fully verify the effectiveness of the attractive smoothing algorithm. However, I am not an expert in this field, so I am not able to appraise it from the aspects of innovation and algorithm design, or whether it has a positive role in promoting the research field. n/a n/a

docsep The paper proposes the use of a supermodular CRF to regularize a DNN model for applications where smoothing is important. The intuition is that while DNNs are easier to scale, they do not take into account the natural constraints in the task (e.g. two neighboring image pixels need to have similar colors). In this DNN-CRF hybrid model there are two DNNs, for generating the feature vectors and the CRF parameters respectively, and a CRF-based inference layer. Results are reported on three tasks: stereo matching, image colorization, and semantic segmentation. On stereo matching, adding the CRF regularizer improves the performance of three DNN baselines; on semantic segmentation, the CRF gives a small boost to the DNN; on image colorization, adding the CRF improves the peak signal-to-noise ratio. Strengths: novel use of supermodular CRFs to improve DNN performance across tasks where smoothing is important; performance improvements on three tasks. Weaknesses: it would be useful to compare performance against alternative neural approaches (e.g. an RNN or a transformer) that can take into account correlations across time steps. The paper does not report the impact on training time due to the CRF regularizer. The paper states that "our approach applies in both discrete and continuous settings", but the approach is evaluated only for scenarios with continuous inputs. It would be useful to provide more details on the hybrid model training, e.g. are there practical tips to ensure convergence, does the DNN-CRF training need to be done in some alternating fashion, etc. The appendix is not included in the paper. There is no explicit discussion of limitations in the paper; hence it would be useful to add a paragraph on limitations to the conclusions section. The authors should also add a sentence about the potential negative societal impact in the conclusions.

docsep The authors present a log-supermodular CRF as a smoothing layer for a wide class of applications, including optical flow estimation, activity recognition, colorization, and segmentation. They show in experiments that the proposed module can improve the performance of the model over all the mentioned tasks. Strengths: easy to read through, and solid experiments. Weaknesses: the authors only compare the deep models with and without attractive smoothing; I would encourage them to consider some other adaptive ad-hoc smoothing methods as baselines. It is hard to tell the improvement through the figures for tasks like colorization and segmentation; also, it is questionable whether segmentation needs smoothing. The authors may mention whether their attractive smoothing idea can be helpful or not for other CRF applications in text, like part-of-speech tagging and dependency parsing. ### Summary: The paper proposes to use log-supermodular CRFs to smooth DNN models. The paper is well motivated and conducts extensive experiments to verify the effectiveness of the attractive smoothing algorithm. It could be better if the paper conducted more experiments based on different networks, such as an RNN and a transformer. Besides, this paper only explores the proposed attractive smoothing; it should make a comparison with other smoothing methods.
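Both reviewers describe the same basic recipe: a task network whose dense outputs are additionally pushed toward agreement on neighboring pixels by an attractive, CRF-style term. The snippet below sketches only that idea with a generic pairwise smoothness penalty; the paper's actual log-supermodular CRF inference layer is more involved, so the penalty form, the weight, and the function names here are assumptions for illustration, not the authors' method.

```python
import torch

def attractive_smoothness_penalty(pred, weight=0.1):
    """Generic pairwise penalty encouraging neighboring outputs to agree.
    pred: (B, C, H, W) dense predictions (e.g. disparity, colors, class logits).
    Stands in for the CRF regularizer discussed in the reviews; it is an
    illustrative simplification, not the paper's inference layer."""
    dh = (pred[:, :, 1:, :] - pred[:, :, :-1, :]).abs().mean()   # vertical neighbors
    dw = (pred[:, :, :, 1:] - pred[:, :, :, :-1]).abs().mean()   # horizontal neighbors
    return weight * (dh + dw)

def training_step(model, images, targets, task_loss_fn):
    # Task loss plus the attractive smoothing term acting as a regularizer.
    pred = model(images)
    return task_loss_fn(pred, targets) + attractive_smoothness_penalty(pred)
```

Measuring the training-time overhead the second reviewer asks for then amounts to timing a step with and without the extra term.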
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 20790, 29328, 271, 12994, 36971, 5933, 3066, 2412, 12185, 2307, 792, 1531, 3671, 534, 3936, 253, 1531, 71, 347, 247, 3963, 6081, 323, 253, 277, 79, 2224, 50276, 2068, 3134, 4679, 327, 247, 2491, 273, 4893, 7568, 326, 1014, 6041, 3210, 562, 32231, 625, 3332, 4394, 342, 253, 1635, 273, 271, 12994, 1531, 71, 50276, 783, 2929, 310, 3542, 4518, 1097, 253, 11745, 285, 18276, 1783, 4679, 403, 973, 2218, 534, 476, 4751, 12654, 253, 3576, 273, 253, 12994, 36971, 5933, 2299, 516, 417, 271, 6485, 275, 436, 1673, 594, 516, 417, 2104, 281, 622, 22525, 352, 432, 253, 7794, 273, 15832, 5933, 2216, 285, 1880, 352, 556, 247, 2762, 2554, 275, 14312, 253, 2561, 1673, 50276, 2072, 5549, 5474, 339, 431, 248, 2929, 29328, 253, 897, 273, 2221, 2307, 792, 1531, 71, 281, 3963, 907, 247, 277, 9866, 1566, 323, 4893, 835, 36971, 310, 1774, 253, 30328, 310, 326, 1223, 277, 79, 2224, 403, 6927, 281, 4311, 597, 513, 417, 1379, 715, 2395, 253, 3626, 10806, 275, 253, 4836, 24088, 767, 20667, 2460, 15115, 878, 281, 452, 2074, 9830, 275, 436, 277, 9866, 7083, 71, 9769, 1566, 627, 403, 767, 277, 79, 2224, 323, 11365, 253, 4735, 11390, 285, 253, 1531, 71, 3602, 2975, 285, 247, 1531, 71, 3169, 17032, 3828, 1543, 403, 5012, 327, 1264, 8892, 36167, 11038, 2460, 3295, 1320, 285, 24705, 26405, 327, 36167, 11038, 6240, 253, 1531, 71, 3963, 6081, 19132, 253, 3045, 273, 1264, 277, 9866, 1666, 25379, 327, 24705, 26405, 1531, 71, 4245, 247, 1355, 9510, 281, 253, 277, 9866, 327, 2460, 3295, 1320, 6240, 253, 1531, 71, 19132, 253, 5241, 2625, 281, 6046, 4313, 50276, 296, 3755, 20556, 50276, 2369, 652, 897, 273, 2221, 2307, 792, 1531, 3671, 281, 3157, 277, 9866, 3045, 2439, 8892, 835, 36971, 310, 1774, 50276, 24159, 11701, 327, 1264, 8892, 50276, 20881, 1255, 265, 50276, 262, 651, 320, 4217, 281, 7277, 3045, 1411, 5795, 11454, 7274, 24088, 391, 9866, 390, 39707, 326, 476, 1379, 715, 2395, 13007, 2439, 673, 5018, 50276, 783, 2929, 1057, 417, 1304, 253, 3486, 327, 3733, 673, 1955, 281, 253, 1531, 71, 3963, 6081, 50275, 783, 2929, 3054, 326, 776, 2746, 10384, 275, 1097, 13358, 285, 5415, 7533, 533, 253, 2746, 310, 6760, 760, 323, 15216, 342, 5415, 14800, 50276, 262, 651, 320, 4217, 281, 2085, 625, 4278, 327, 253, 9769, 1566, 3733, 24088, 403, 627, 8542, 12192, 281, 5416, 14940, 1057, 253, 277, 9866, 7083, 71, 3733, 878, 281, 320, 2218, 275, 690, 28035, 8142, 3966, 50275, 783, 30762, 310, 417, 2908, 275, 253, 2929, 50269, 9088, 310, 642, 6843, 5955, 273, 7364, 275, 253, 2929, 7613, 352, 651, 320, 4217, 281, 823, 247, 12494, 327, 7364, 281, 253, 11815, 2593, 253, 4477, 943, 671, 823, 247, 6197, 670, 253, 2442, 4016, 38058, 3486, 275, 253, 11815, 50276, 7152, 339, 431, 248, 4477, 1246, 2412, 12185, 2307, 792, 1531, 71, 347, 247, 36971, 3828, 323, 247, 4618, 966, 273, 4893, 1690, 5748, 2685, 13418, 2425, 8981, 3295, 1320, 285, 26405, 597, 921, 275, 4679, 326, 253, 4081, 6333, 476, 3157, 253, 3045, 273, 253, 1566, 689, 512, 253, 5393, 8892, 20544, 50276, 36423, 281, 1239, 949, 285, 4891, 4679, 50276, 20881, 1255, 50276, 783, 4477, 760, 7277, 253, 3676, 3210, 342, 285, 1293, 12994, 36971, 891, 651, 11907, 731, 281, 1908, 690, 643, 17825, 519, 37806, 36971, 3082, 347, 1666, 25379, 50276, 953, 1892, 281, 2028, 253, 7756, 949, 253, 8442, 323, 8892, 751, 3295, 1320, 285, 26405, 671, 352, 310, 30455, 1880, 26405, 3198, 36971, 253, 2488, 778, 3748, 604, 616, 12994, 36971, 2934, 476, 320, 417, 13070, 1020, 323, 
643, 1531, 71, 4893, 275, 2505, 751, 629, 1171, 48460, 48510, 285, 18925, 29072, 2490, 187, 4118, 18435, 27, 783, 2929, 29328, 281, 897, 50275, 2808, 12185, 2307, 792, 1531, 3671, 50276, 936, 6032, 253, 277, 9866, 3210, 253, 2929, 310, 973, 17194, 285, 2589, 9470, 4679, 281, 50276, 36302, 253, 3576, 273, 253, 12994, 36971, 5933, 50275, 262, 812, 320, 1805, 604, 253, 2929, 2589, 625, 4679, 1754, 327, 1027, 6928, 824, 347, 391, 9866, 285, 39707, 50276, 67, 11587, 436, 2929, 760, 33826, 253, 4081, 12994, 36971, 352, 943, 1056, 247, 3294, 1297, 342, 643, 36971, 3082, 50276 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 20790, 29328, 271, 12994, 36971, 5933, 3066, 2412, 12185, 2307, 792, 1531, 3671, 534, 3936, 253, 1531, 71, 347, 247, 3963, 6081, 323, 253, 277, 79, 2224, 50276, 2068, 3134, 4679, 327, 247, 2491, 273, 4893, 7568, 326, 1014, 6041, 3210, 562, 32231, 625, 3332, 4394, 342, 253, 1635, 273, 271, 12994, 1531, 71, 50276, 783, 2929, 310, 3542, 4518, 1097, 253, 11745, 285, 18276, 1783, 4679, 403, 973, 2218, 534, 476, 4751, 12654, 253, 3576, 273, 253, 12994, 36971, 5933, 2299, 516, 417, 271, 6485, 275, 436, 1673, 594, 516, 417, 2104, 281, 622, 22525, 352, 432, 253, 7794, 273, 15832, 5933, 2216, 285, 1880, 352, 556, 247, 2762, 2554, 275, 14312, 253, 2561, 1673, 50276, 2072, 5549, 5474, 339, 431, 248, 2929, 29328, 253, 897, 273, 2221, 2307, 792, 1531, 71, 281, 3963, 907, 247, 277, 9866, 1566, 323, 4893, 835, 36971, 310, 1774, 253, 30328, 310, 326, 1223, 277, 79, 2224, 403, 6927, 281, 4311, 597, 513, 417, 1379, 715, 2395, 253, 3626, 10806, 275, 253, 4836, 24088, 767, 20667, 2460, 15115, 878, 281, 452, 2074, 9830, 275, 436, 277, 9866, 7083, 71, 9769, 1566, 627, 403, 767, 277, 79, 2224, 323, 11365, 253, 4735, 11390, 285, 253, 1531, 71, 3602, 2975, 285, 247, 1531, 71, 3169, 17032, 3828, 1543, 403, 5012, 327, 1264, 8892, 36167, 11038, 2460, 3295, 1320, 285, 24705, 26405, 327, 36167, 11038, 6240, 253, 1531, 71, 3963, 6081, 19132, 253, 3045, 273, 1264, 277, 9866, 1666, 25379, 327, 24705, 26405, 1531, 71, 4245, 247, 1355, 9510, 281, 253, 277, 9866, 327, 2460, 3295, 1320, 6240, 253, 1531, 71, 19132, 253, 5241, 2625, 281, 6046, 4313, 50276, 296, 3755, 20556, 50276, 2369, 652, 897, 273, 2221, 2307, 792, 1531, 3671, 281, 3157, 277, 9866, 3045, 2439, 8892, 835, 36971, 310, 1774, 50276, 24159, 11701, 327, 1264, 8892, 50276, 20881, 1255, 265, 50276, 262, 651, 320, 4217, 281, 7277, 3045, 1411, 5795, 11454, 7274, 24088, 391, 9866, 390, 39707, 326, 476, 1379, 715, 2395, 13007, 2439, 673, 5018, 50276, 783, 2929, 1057, 417, 1304, 253, 3486, 327, 3733, 673, 1955, 281, 253, 1531, 71, 3963, 6081, 50275, 783, 2929, 3054, 326, 776, 2746, 10384, 275, 1097, 13358, 285, 5415, 7533, 533, 253, 2746, 310, 6760, 760, 323, 15216, 342, 5415, 14800, 50276, 262, 651, 320, 4217, 281, 2085, 625, 4278, 327, 253, 9769, 1566, 3733, 24088, 403, 627, 8542, 12192, 281, 5416, 14940, 1057, 253, 277, 9866, 7083, 71, 3733, 878, 281, 320, 2218, 275, 690, 28035, 8142, 3966, 50275, 783, 30762, 310, 417, 2908, 275, 253, 2929, 50269, 9088, 310, 642, 6843, 5955, 273, 7364, 275, 253, 2929, 7613, 352, 651, 320, 4217, 281, 823, 247, 12494, 327, 7364, 281, 253, 11815, 2593, 253, 4477, 943, 671, 823, 247, 6197, 670, 253, 2442, 4016, 38058, 3486, 275, 253, 11815, 50276, 7152, 339, 431, 248, 4477, 1246, 2412, 12185, 2307, 792, 1531, 71, 347, 247, 36971, 3828, 323, 247, 4618, 966, 273, 4893, 1690, 5748, 2685, 13418, 2425, 8981, 3295, 1320, 285, 26405, 597, 921, 275, 4679, 326, 253, 4081, 6333, 476, 3157, 253, 3045, 273, 253, 1566, 689, 512, 253, 5393, 8892, 20544, 50276, 36423, 281, 1239, 949, 285, 4891, 4679, 50276, 20881, 1255, 50276, 783, 4477, 760, 7277, 253, 3676, 3210, 342, 285, 1293, 12994, 36971, 891, 651, 11907, 731, 281, 1908, 690, 643, 17825, 519, 37806, 36971, 3082, 347, 1666, 25379, 50276, 953, 1892, 281, 2028, 253, 7756, 949, 253, 8442, 323, 8892, 751, 3295, 1320, 285, 26405, 671, 352, 310, 30455, 1880, 26405, 3198, 36971, 253, 2488, 778, 3748, 604, 616, 12994, 36971, 2934, 476, 320, 417, 13070, 1020, 323, 
643, 1531, 71, 4893, 275, 2505, 751, 629, 1171, 48460, 48510, 285, 18925, 29072, 2490, 187, 4118, 18435, 27, 783, 2929, 29328, 281, 897, 50275, 2808, 12185, 2307, 792, 1531, 3671, 50276, 936, 6032, 253, 277, 9866, 3210, 253, 2929, 310, 973, 17194, 285, 2589, 9470, 4679, 281, 50276, 36302, 253, 3576, 273, 253, 12994, 36971, 5933, 50275, 262, 812, 320, 1805, 604, 253, 2929, 2589, 625, 4679, 1754, 327, 1027, 6928, 824, 347, 391, 9866, 285, 39707, 50276, 67, 11587, 436, 2929, 760, 33826, 253, 4081, 12994, 36971, 352, 943, 1056, 247, 3294, 1297, 342, 643, 36971, 3082, 50276 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper leverages the recent breakthrough of masked autoencoders mae for the domain of robotics it shows that visual knowledge can be transferred from different domains such as ego4d into specific robotics domains thanks to mae pretraining the paper finds that mae is superior to clip pretraining it provides some analysis of data and model scales and finds that large data and large model works the best this is evaluated on 4 real but toy scenes with picking reaching and pushing skills with high accuracy results for the mae pretraining strengths good to see that somewhat different domain data such as ego4d can be useful for this robotics setup scaling to large models with large data improves performance similar to trends seen in foundation models in other domains this is a good because it means it can leverage large datasets from non robotics sources the advantages of mae are kept here minimal assumptions about data and not relying on specific data augmentation the analysis of data and model sizes is useful information for the robotics domain as opposed to just the vision domain from the mae paper approach generalizes to different morphologies without finetuning weaknesses no quantitative results or videos for hand experiment not yet fully real scenes setups are still somewhat simplistic with uniform backgrounds sink and fridge scenes are a bit more complex but not yet real for the fridge door the arm starts quite close to the door and only has to decide if it pushes the left side or the right side for bucket scenes objects are always at the same ground distance relative to the robot this might be the reason why depth was not required to solve these tasks but this will be different in real scenes where the robot body can be at any distance to the objects a pretty direct application of mae while its great to see it in a robotics setting the innovation is pretty incremental docsepthis paper evaluates vision encoders pretrained using masked autoencoders on large scale datasets pretrained vision encoders are evaluated for real world robotics tasks with variations of everyday scenes as well as novel objects strengths evaluating performance for real world policies by pretraining on scalable datasets is an important problem that this paper addresses the real world experiments are good and contain some variety of tasks demonstrating that pretraining using masked autoencoders works better than a clip baseline for several tasks ablations comparing different camera viewpoints model and data size weaknesses the baselines compare to only vit architectures however these are generally harder to train than simpler conv architectures like resnet i would be interested in comparing to a fromscratch resnet50 or smaller which is easier to train as another comparison point authors could also compare to the resnet clip model comparing to visual encoders that are finetuned would also be valuable since simpler architectures may improve with finetuning when training the policies on demonstration data other comparisons and ablations would be useful for comparing effectiveness of method for example 1 evaluating the effectiveness finetuning the vision encoder during the policy learning phase 2 comparing directly to r3m loss formulation nair et al experiments in sim could allow more comprehensive and scalable evaluations docsepthis paper shows a way to use large scale visual self supervision of inthewild 
massive datasets and use it to improve behavior cloning on robots specifically they use the masked autoencoder mae framework of he et al on a massive dataset of imagenet ego4d etc image frames to pretrain an encoder which shows improvements for training inhandcamerabased bc policies strengths strong accept while considerable work has investigated visual pre training to improve robot policy learning and for example behavior cloning the key thing about this work compared to prior works is that it can benefit from pre training on just a big pile of images with no assumptions about them other than that they cover a wide diverse set of images it is indeed also notable that they show improvements of using vitlarge a 307m parameter model for those that believe in the just scale things up hypothesis of robotics this indicates a potentially very compelling route forward the experiments seem pretty good too i think all the experiments only work for inhand cameras and all the tasks are basically just some form of a reaching task without any dexterous closed loop feedback required but they are still good enough to interestingly explore some difficulties in this domain such as the cluttered sink environment i appreciate the different types of scaling model size andor data size experiments although 9 is closely related and probably from the same set of authors the present work is very interestingly different for one 9 used rl and required millions of environment interactions and was only shown in simulation instead the present work shows real robot results and instead uses behavior cloning it would be interesting to see rl on the real robots too but of course its easier to deploy bc instead of on policy rl in practice also the present work shows further scaling of vision model size than was shown in 9 weaknesses to further improve the paper the authors may consider addressing there are no videos of the allegro experiments its hard to evaluate these without videos currently they feel like kind of throwaway experiments how was imitation learning actually done i assume its just mse loss behavior cloning but its not actually specified as mentioned above the tasks are interesting in some ways clutter but pretty simplistic in other ways no feedback or multiobjectconditioning is required harder tasks would be interesting in future the discussion for related work could further be improved within self supervised vision for robotics there have been many approaches that have shown improvements for policy learning see for example httpsarxivorgpdf190906933pdf but i think the key thing is that the present paper shows using a significantly more massive dataset for pre training and with no assumptions about the data in recent work xiao et al 9 have shown that selfsupervised visual pretraining is effective for learning motor control tasks in simulation respectfully there is nothing in this statement that is unique to xiao et al it would probably be better to either more specifically call out the mae method used in xiao et al andor refer instead to the concept that it used self supervision on internet data or instead make this a more generic statement about the field rather than specific to xiao et al docsepthis paper scales up vision models and data in order to train better vision backbones for visuomotor policies using behavior cloning they aggregate several existing datasets in order to generate a large number of frames to pretrain a large high quality vision encoder vit using mae the authors conduct extensive 
experiments to show that scaling both the model and data substantially improves performance on these tasks strengths the paper is well presented and makes its points clearly the authors conduct many real world experiments to demonstrate the effectiveness of their approach their new model performs very well against reasonable baselines weaknesses the only real weakness here is that there is very little technical novelty at this point everybody knows scaling up to larger models with more data is a good way to go more strengths despite the weakness above this paper is still a valuable contribution to the community while the large ideas here are all well known the specific recipes are useful and will provide others with practical technical guidance it is also useful to see that so much of the performance deficit on these problems can be solved by simply improving the vision model ### Summary:
this paper uses a masked autoencoder as visual pretraining for robot manipulation tasks the paper shows strong results and received positive initial reviews the additional experiments provided during the rebuttal strengthen the papers contribution claim
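The recipe described in the reviews above, freezing a ViT encoder pretrained with masked autoencoding and training a small policy head by behavior cloning on demonstration data, can be sketched roughly as follows. This is an illustrative sketch, not the paper's code: the specific ViT variant, the timm backbone, the 7-dimensional action space, and the MSE behavior-cloning loss are all assumptions (one reviewer notes the imitation objective is never specified in the paper).

```python
# Minimal sketch (assumptions noted above): behavior cloning on top of a frozen,
# MAE-pretrained ViT encoder, trained on (image, action) demonstration pairs.
import torch
import torch.nn as nn
import torch.nn.functional as F
import timm  # assumed backbone provider; MAE weights would be loaded separately

encoder = timm.create_model("vit_large_patch16_224", pretrained=False)
encoder.reset_classifier(0)            # keep pooled features only
for p in encoder.parameters():         # reviews suggest the encoder is reused without finetuning
    p.requires_grad = False

policy = nn.Sequential(                # small head mapping visual features to robot actions
    nn.Linear(encoder.num_features, 256),
    nn.ReLU(),
    nn.Linear(256, 7),                 # e.g. a 7-DoF arm command (assumed)
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)

def bc_step(images: torch.Tensor, actions: torch.Tensor) -> float:
    """One behavior-cloning update; the MSE loss is an assumption."""
    with torch.no_grad():
        feats = encoder(images)        # frozen visual representation
    loss = F.mse_loss(policy(feats), actions)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this setup only the policy head is optimized, which matches the reviewers' observation that most of the remaining performance gap can be closed by improving the vision model rather than the controller.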
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: through both rigorous theoretical analysis and empirical findings this paper extends results on learning using privileged information to general time series governed by linear latent dynamics and potentially nonlinear observation models under this general class of possible models the authors demonstrate through learning theoretic analysis that learning with privileged time series data that is available at training time but not at inference time always leads to lower or equivalent risk given that the nonlinear map from observations to latent variables is known up to a linear mapping they then extend their results to unknown mappings via an argument based on the use of random feature maps finally the authors propose to approximate unknown mappings with deep learning architectures in their empirical study they demonstrate that the proposed learners with privileged information generally outperform baselines in terms of sample efficiency the paper benefits both from strong theoretical analysis and a rigorous empirical setup the results presented are relevant and significantly expand understanding of learning using privileged time series lupts information to the best of my knowledge the paper proposes both novel theoretical findings and explores a set of new approaches for lupts finally the paper is wellwritten with clear and concise use of language my only concern about the paper is motivation it is not clear to me except some cases alluded to in the introduction to the paper in which applications one would be interested in using privileged information unfortunately the empirical study also does not motivate a single application but rather transforms existing time series applications such that the problem definition fits i believe the readers would benefit greatly from a discussion of motivation and potential impact i would recommend to spend much more time on the motivation and amplify this in the experiments for example how does this setting connect to images which is presented in the experiments i would recommend to make the findings around l151 more precise instead of deferring to the appendix as i believe this is one of the key findings in the paper minor point l66 typo in respect docsepthe paper deals with the data generated by the following model a vector z1 comes from some distribution then it is subjected to a chain of linear transformations the vector z2 is obtained as a linear transformation of z1 plus some noise z3 is a linear transformation of z2 etc finally we get to zt then zt generates a label y by a linear transformation of a finite dimensional feature mapping the vectors zt are never visible directly there are two modes of access to them in the first classical scenario we get to see x1 = psi(z1) where psi is a linear transformation in the second privileged scenario we get access to x2 = psi(z2) x3 = psi(z3) etc we can have either classical or privileged information at the training stage but only get classical information at the test stage the paper compares two learning approaches in the first classical approach we use least squares to regress y on a feature mapping of x1 in the second privileged approach we try to reconstruct the whole model using least squares on the training set to find all transition linear transformations etc and get y the paper shows that the variance of the latter approach is lower than the variance of the former the advantage disappears if the feature
mapping is infinite dimensional and we get nonsingular gram matrices on the data the experimental results include reconstruction of the feature mapping using neural networks i think this is an interesting result but its significance is somewhat limited the authors very honestly show the key limitation for universal kernels the result is certainly important for the dynamic systems community but i have doubts about its appeal to a wider ml audience the very model appears limited although it has important applications in neurophysiology typos and suggestions page 3 line 93 on the form of the form i also find the title somewhat misleading this is not what is usually meant by a time series dynamic systems would be better yes docsepthis paper proposes new insights into the analysis of learning using privileged information
lupi a learner with access to intermediate time series data and generalizes it to nonlinear prediction tasks in latent dynamical systems the authors extend theoretical guarantees to the case where the map connecting latent variables and observations is known up to a linear transform in addition the paper proposes algorithms based on random features and representation learning for the case when the map is unknown the paper is well written and theoretically sound the paper is clear easy to follow and brings to the fore an interesting contribution particularly the fact that the authors extend the lupts framework to nonlinear models and prediction tasks in latent dynamical systems is an important step furthermore the authors prove that learning with privileged information leads to lower risk when the nonlinear map connecting latent variables and observations is known up to a linear transform this is also an important contribution the biasvariance analysis is also timely and important also the experimentation with real data namely the prediction of traffic volume and alzheimers progression is critical to validating the approach however the fact that the study is limited to predictions for a fixed time horizon given present observations limits the extent to which the approach could be validated and applied it would be good if the authors could discuss the impact of their biasvariance analysis on societal applications docsepthe manuscript proposes to predict nonlinear outputs using time series privileged information during training the privileged information is not used for inference the authors prove in theorem 1 that under some conditions the generalized lupts is never worse in expectation than the classical learner experiments on both synthetic and realworld data sets show that using privileged information increases the sample efficiency and helps latent variable recovery originality from the title it is not clear how much this manuscript differs from 1 karlsson et al 2021 httpsproceedingsmlrpressv151kakarlsson22akakarlsson22apdf while in the main text the manuscript tackles the task by modeling with latent variables for each observation which is a major difference if i understand correctly theorem 1 in the manuscript is similar to theorem 1 in 1 with some changes of conditions the manuscript should better formulate the difference from 1 quality the manuscript is technically sound clarity the manuscript is overall clearly written some minor suggestions are as follows 1 unobserved applies to figure 1 b but not figure 1 a 2 it is not clear what random variables the distribution p models in line 62 3 the matrix inverse notation (cdot)^{-1} in line 104 cannot be found in the text 4 figure 3 b is not informative enough 5 it is not clear what metric is used in the experimental evaluation what is r2 in section 4 significance the algorithm proposed in the manuscript would benefit the time series modeling community 1 karlsson rickard ka et al using timeseries privileged information for provably efficient learning of prediction models international conference on artificial intelligence and statistics pmlr 2022 the authors addressed the limitations ### Summary:
this paper considers a particular setting of time series prediction with privileged information a special case can be described as predicting x_{t+k} from x_t at training time one is also given x_{t+1} x_{t+2} ... x_{t+k-1} and a latent dynamics is assumed the paper presents a learning algorithm that leverages privileged information at train time provides rigorous theoretical analysis of this algorithm and convincing numerical experiments this paper is definitely of interest to the ml community and would serve as an interesting contribution to the conference
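The two learners contrasted in the reviews above, a classical least-squares regression of y on the first observation versus a privileged learner that also fits the intermediate time steps seen only during training, can be sketched numerically. This is an illustrative reconstruction under simplifying assumptions (identity observation map, plain least squares, synthetic Gaussian data), not the paper's implementation; the paper additionally handles a nonlinear observation map.

```python
# Sketch of classical vs privileged least-squares learners for a linear latent chain
# z1 -> z2 -> ... -> zT with label y generated from zT (identity observation map assumed).
import numpy as np

rng = np.random.default_rng(0)
d, T, n = 5, 4, 200
A = [rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(T - 1)]   # true transition matrices
w = rng.normal(size=d)                                             # true label weights

def simulate(n_samples):
    Z = [rng.normal(size=(n_samples, d))]
    for t in range(T - 1):
        Z.append(Z[-1] @ A[t].T + 0.1 * rng.normal(size=(n_samples, d)))
    y = Z[-1] @ w + 0.1 * rng.normal(size=n_samples)
    return Z, y

Z, y = simulate(n)               # training: all time steps visible (privileged)
Z_test, y_test = simulate(2000)  # test: only the first observation is available

lstsq = lambda X, Y: np.linalg.lstsq(X, Y, rcond=None)[0]

beta_classical = lstsq(Z[0], y)                          # regress y directly on x1
A_hat = [lstsq(Z[t], Z[t + 1]) for t in range(T - 1)]    # fit each transition step
beta_priv = lstsq(Z[-1], y)
for M in reversed(A_hat):                                # compose the chain back to x1
    beta_priv = M @ beta_priv

for name, b in [("classical", beta_classical), ("privileged", beta_priv)]:
    print(name, float(np.mean((Z_test[0] @ b - y_test) ** 2)))
```

Both predictors map the first observation to a label estimate; the claim discussed in the reviews is that the privileged construction has lower or equal variance, which in small-sample regimes typically shows up as lower test error.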
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper develops tools for analyzing private algorithms in the framework of gaussian differential privacy gdp gdp is a strengthening of approximate differential privacy similar in spirit to concentrated differential privacy cdp both gdp and cdp capture the privacy guarantees of the gaussian noise mechanism but generalize them in different directions gdp has a number of nice analytical properties but analyzing the gdp guarantees of an algorithm can be difficult as it involves comparing two different functions on [0, infty) the paper proposes a method to reduce this problem to comparing a function against a fixed constant function making numerical or analytical gdp analysis of a given algorithm easier gdp is an elegant framework for analyzing private algorithms due to its nice composition properties a wider application of the framework however would benefit from automated tools for deriving tight gdp guarantees for a given algorithm the main strength of the paper is in giving one such tool and showing how it can ease the privacy analysis of several algorithms the visualizations of the privacy profiles transformed by gdpt in the paper provide some insight on the privacy guaranteed by several basic algorithms the main weakness of the paper for me is in the strength of the applications mentioned at the end the analysis of the laplace noise mechanism is simple and probably not the best illustration of the power of the methods here the analysis of sgd is limited to giving guarantees for some constant varepsilon and delta in particular a value of delta that is too high for deployment the illustration that the partial order on varepsilon delta can help improve an algorithms error analysis is interesting however i am not sure what the takeaway is from the analysis of subsampling so gdpt may end up being a useful tool in a privacy toolbox built around gdp but the paper does not quite manage to make a strong case for it because of this i will be ok if the paper were accepted to iclr but i will not push for it some further minor comments dp definition has a typo missing less than or equal to sign concentrated dp was first proposed by dwork and rothblum httpsarxivorgabs160301887 the references for dp sgd should contain bassily smith thakurta focs 2014 is mu_gdp(x, y) welldefined what about existence and uniqueness of mu the paper makes a contribution to automating the privacy analysis of algorithms within the elegant gdp framework but doesnt manage to make a strong case for the usefulness and applicability of the new tool it develops docsepin this paper the authors propose the gaussian differential privacy transformation to identify which algorithms fall under the framework of gdp this paper attempts to identify a class of dp algorithms they call gdp algorithms it is largely unclear what is the motive of the paper and what is the application i will list down some of the issues below 1 the theorems in section 3 are either results of previous works or are at best trivial facts 2 the theorems in section 4 are just basic algebraic manipulations this paper thus lacks technical novelty 3 the purpose of the introduction of the transform is unclear to me there is no formal result showing the improvement in utility for a class of algorithms the abstract claims their technique refines the utility but this is illustrated using only 1 example and some basic calculations 4 the abstract claims they study the
effect of subsampling on the gdpt but the only thing mentioned about subsampling is that it doesnt give any amplification largely i dont see any applicability of the transformation introduced in this paper the paper lacks technical novelty and makes no real contributionimprovement to support the new transformation for characterizing introduced thus i think it is not good enough for publication docsepthis work is an investigation into the newly proposed gaussian differential privacy gdp in the context of existing formulations of epsilon delta differential privacy dp authors propose a novel framework for gdp transformation gdpt that analyses the properties of gdp and contrasts it to dp through the lens of the adversarial headtail composition analysis showing both theoretically and empirically that it results in better algorithmic utility authors provide a list of applications for gdpt and outline the benefits of this approach primarily through tighter privacy bounds the work presents a comprehensive overview of the gdp in the context of privacy profiling that is attempting to reduce the detrimental effects of dp on the results of algorithmic training there are many interesting insights presented based on the results of a closer analysis of gdp in regards to its behaviour around the headtail composition of the function through the lens of the novel formulation of gdpt including experimentally shown tighter privacy analysis resulting in better utility of the algorithm the motivation is very clear to me as the existing formulations of dp are notoriously difficult to interpret by a nonspecialist require multiple parameters that are not easily defined in practise and provide guarantees that can be difficult to put into perspective in addition dp is known to be detrimental to the utility of the trained model which a more carefully selected formulation such as gdp can partially mitigate therefore i believe that this work is a needed step towards democratisation of privacypreserving ml and a timely investigation into mitigations of dps detrimental effects to algorithmic training performance major comments while the motivation behind the analysis of gdp is clear the potential applications of the work in general are too the takehome message of the findings is not is the key message of the paper that gdp gives tighter guarantees or that existing dp formulations are pessimistic and detrimental to utility it is very difficult to assess the impact of the work when there is no clear message on how and why it needs to be integrated into the existing workflows to alleviate some of these issues i would suggest being much more explicit in the conclusion and contributions in my eyes gdpt on its own is difficult to put into practical context i do however acknowledge that section 5 introduces how gdpt can be applied to existing notions but in my opinion this comes very late and the conclusions are very adhoc there is no clear direction of how the insights from this work are meant to be interpreted and what the main message of the manuscript is in terms of content my main concern here is the overall novelty of the work and the insights presented there exists a similar line of work on the interpretation of gdp by asoodeh et al 1 that was published in march 2021 and presents an argument about the properties of gdp along with its relation to dp and the associated effects on the privacyutility tradeoff this work however was not cited or addressed in this manuscript similar things can be said about bu et al2 which 
considers the application of gdp in the context of deep learning many of the insights seen in this work eg the interpretation of gdp and its relation to dp in general have already been presented in these two works and are thus hardly novel while the goals of this work and 1 are tangential and gdpt being based on dp analysis through the headtail behaviour and not on a reinterpretation of other dp formulations and authors here consider laplace mechanism rather than gaussian the conclusions these works present are very similar i would like therefore to request that reviewers compare their contributions to the work by 1 and specify the novel conclusions that can be drawn from their work in comparison to the prior work i am open to changing my review should the authors present a sound argument that their work derives additional insights when compared to the literature i linked and particularly their implications for the wider scientific community 1 asoodeh shahab et al three variants of differential privacy lossless conversion and applications ieee journal on selected areas in information theory 21 2021 208222 2 bu zhiqi et al deep learning with gaussian differential privacy harvard data science review 202023 2020 another concern i have is a lack of any discussion on the gaussian mechanism as this paper is positioned to be applicable for ml practitioners i find the scope of these findings rather limiting especially if we were to consider deep learning if only the laplacian mechanism is considered due to its unfavourable composition properties moreover i am unsure if it is enough to only assess laplacian in the work that considers gaussian dp and completely disregard the gaussian mechanism in general i would like to see justification on why laplace is the suitable choice here as otherwise i am struggling to see the broader impact of this work and would therefore encourage authors to include some concrete applications of their work on this note i would have expected a more detailed discussion on the applications of this work or of dp in general in the domain of ml specifically i would encourage authors to expand their section 5 and provide explicit applications of their work in this domain the experimental results figure 1 could be explained better the results are relatively straightforward but the subsection comes unannounced and following the method of obtaining these was not straightforward minor comments the actual novelty of the paper is not very clearly presented the abstract contains a lot of information about gdpts details so the actual contributions are diluted and the same thing can be said about the introduction as a result it is very difficult to determine the actual novel contributions that the work contains i would suggest trimming the technical details down to make the contributions of the paper more convincing multiple issues with definition 11 missing a less than or equals sign does not specify the condition of differ is this an addremove a record or is it a replace a record overall while i do believe that this paper tries to address a very important problem and is formally sound i am not fully convinced that there is enough novelty in the existing manuscript to justify the acceptance and therefore i believe that this paper is just below the bar of acceptance docsep summary of contributions this paper considers gaussian differential privacy gdp notion which is an extension of epsilon deltadifferential privacy dp roughly speaking an algorithm is mugdp if it is at least as private 
as gaussian mechanism with noise multiplier 1mu the main advantages of gdp are that i it has a simple and sharper composition theorem than epsilon deltadp and ii it has only a single parameter mu leading to simplermore efficient computation compared to other dp notions that provide good composition eg fdp or privacy loss distribution pld this paper develops tools that can help understanddeploy gdp more easily specifically they propose gaussian differential privacy transformation gdpt which roughly speaking is the curve mu mathbbr to mathbbr where muepsilon is the smallest epsilon for which the algorithm is at least as private as gaussian mechanism of noise multiplier 1muepsilon at this particular value of epsilon with the above notion they derive in theorems 43 44 the condition for an algorithm to be mugdp for some finite mu it must be epsilon deltaepsilondp where limsup fracepsilon2log deltaepsilon is finite they then show how to approximately check whether an algorithm is gdp specifically they show that it suffices to check a certain condition on a finite number of points corresponding to an upper staircase in figure 2 right they show that if this condition holds the algorithm is epsilonh muhead gdp roughly meaning that it is gdp for all epsilon epsilonh they argue that in practice taking epsilonh to be sufficiently large should be enough but they also give a clip and noise procedure to turn an epsilonh muhead gdp algorithm into one which is actually mugdp along the way the paper also shows a new implication between epsilon deltadp and epsilon deltadp previously it was known that the former implies the latter if epsilon geq epsilon and delta geq delta in theorem 31 of this paper the authors show that the implication may even hold when epsilon epsilon but delta also has to be larger than delta by a certain amount depending on delta epsilon epsilon the authors give some example in section 5 where this relation gives nontrivial implications for some known algorithms strengths this paper proposes a transformation and several auxiliary lemmas that may help make gdp more practical and more widely applicable the nontrivial relationship between epsilon deltadps is interesting weaknesses i do not believe that gdpt yields significant insight that cannot be drawn from the privacy profile curve itself specifically almost all the applications can be derived via the latter as well more details are presented below in detailed comments for authors although the paper claims to provide engineering tools for gdp no evaluation is given on this front i think it is important to show eg how tight how efficient gdp tools presented here are compared to other methods in literature such as privacy loss distributions sommer et al pets 2019 the utility improvement from the nontrivial relationship between epsilon deltadps does not seem that appealing specifically the examples in the paper uses very high delta all of them larger than 005 which is not a value used anywhere in practice i have not seen any practical application where delta 104 indeed this does not seem like a coincidence the new relation only seem useful for epsilon epsilon0 relatively close and delta delta0 relatively large otherwise the additive term in delta0 would be too large the nontrivial relationship between epsilon deltadps also seems to be implicit in previous work as an example the paper optimal accounting of differential privacy via characteristic function by zhu et al has a characterization that says that the privacy curve must be convex 
lemma 11 when the xaxis is changed from epsilon to eepsilon using this convexity on 0 epsilon0 epsilon also yields this characterization i think this is a minor weakness since it is good for the statement to be written down in a more explicit form as is done here anyway detailed comments for authors regarding gdpt theorem 44 can be easily derived from lemma a1 by setting up fepsilon leq deltamuepsilon for some mu for theorem 46 the staircase functions taken are equivalent to rounding up or down deltaepsilon in the privacy profile for all epsilon in xi xi 1 in each i so the entire proof of theorem 46 can also be derived as easily via the privacy profile curve as well overall i think itd be better if you show an application of gdpt that would be much harder to derive under the privacy profile notion otherwise im not sure why we should define yet another privacy curve in addition to the already many curves other more minor comments definition 11 missing leq in the equation page 2 paragraph before section 11 sigma is not yet defined page 3 second example isnt the gaussian mechanism itself also an example of such a tradeoff page 3 third example the icea algorithm of ghazi et al 2019 was actually proposed much earlier by ishai et al focs 2006 in the paper cryptography from anonymity see also the papers private aggregation from fewer anonymous messages by ghazi et al eurocrypt 2020 and private summation in the multimessage shuffle model by balle et al ccs 2021 which give better analysis of icea compared to ghazi et al 2019 page 3 third example m in the icea algorithm is not maximum error but the number of messages ie additive shares per user page 5 definition 32 it doesnt seem like the tail condition is used in a meaningful way at least in the main body if so then maybe there is no need to define it at least in the main body page 6 before theorem 41 the following theorems formalizes the following theorems formalize recommendation overall although i think this paper advances the understanding of gdp i am unconvinced that gdpt yields significant novel insight that cannot be achieved otherwise and more empirical evaluations have to be made in order to determine the practicality of the tools proposed here due to this i recommend rejection ### Summary:
We thank the authors for their response. The reviewers agree that this paper provides contributions in automating privacy analyses under the Gaussian differential privacy (GDP) framework. The reviewers also pointed out several drawbacks of the paper; most importantly, they do not find the presented applications to be convincing. In particular, the presented result could be much strengthened if the proposed method led to improved privacy analysis for more sophisticated algorithms such as DP-SGD across a wide regime of ε and δ; in general, the stated privacy guarantee is very weak, with δ bigger than 1/n. Overall, the paper does not seem to provide enough evidence to showcase the usefulness of the proposed method.
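For reference, the two notions the reviews keep contrasting can be written out in their standard form from the DP/GDP literature; the following restates textbook definitions (including the ≤ that reviewers note is missing from the paper's Definition 1.1) and is not quoted from the paper under review. A mechanism M is (ε, δ)-DP if for all neighbouring inputs D, D′ and all measurable sets S, $\Pr[M(D) \in S] \le e^{\varepsilon} \Pr[M(D') \in S] + \delta$. It is μ-GDP if its privacy profile is dominated by that of a sensitivity-1 Gaussian mechanism with noise multiplier 1/μ, i.e. for every ε ≥ 0, $\delta(\varepsilon) \le \delta_\mu(\varepsilon) := \Phi(-\varepsilon/\mu + \mu/2) - e^{\varepsilon}\,\Phi(-\varepsilon/\mu - \mu/2)$, where Φ is the standard normal CDF. Read this way, the GDPT the reviewers describe is the pointwise map ε ↦ μ(ε), the smallest μ whose Gaussian profile dominates the algorithm's profile at that ε; the algorithm is μ-GDP exactly when $\sup_{\varepsilon} \mu(\varepsilon) \le \mu$.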
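To make the transformation concrete, here is a minimal numerical sketch of that pointwise map for the Laplace mechanism, the example the first reviewer found too simple: for each ε on a grid, bisect for the smallest μ whose Gaussian profile dominates the mechanism's profile at that ε. The Laplace profile below is the standard closed form from the privacy-profile literature; everything else (grid, tolerances, bracket doubling) is an arbitrary illustrative choice, not the paper's actual GDPT procedure.

```python
import math

def phi(x: float) -> float:
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gaussian_profile(eps: float, mu: float) -> float:
    """delta_mu(eps): privacy profile of the mu-GDP reference mechanism
    (sensitivity-1 Gaussian mechanism with noise multiplier 1/mu)."""
    if mu <= 0.0:
        return 0.0
    return phi(-eps / mu + mu / 2.0) - math.exp(eps) * phi(-eps / mu - mu / 2.0)

def laplace_profile(eps: float, eps0: float) -> float:
    """Privacy profile of a Laplace mechanism satisfying pure eps0-DP."""
    return max(0.0, 1.0 - math.exp((eps - eps0) / 2.0))

def gdpt_point(target_delta: float, eps: float, tol: float = 1e-12) -> float:
    """Smallest mu with gaussian_profile(eps, mu) >= target_delta, by bisection:
    one point of the transformed curve eps -> mu(eps)."""
    if target_delta <= 0.0:
        return 0.0
    lo, hi = 0.0, 1.0
    while gaussian_profile(eps, hi) < target_delta:   # grow the bracket until it covers the target
        hi *= 2.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if gaussian_profile(eps, mid) < target_delta:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return hi

if __name__ == "__main__":
    eps0 = 1.0  # pure-DP level of the Laplace mechanism being transformed
    for eps in [0.0, 0.25, 0.5, 0.75, 0.9, 1.0, 2.0]:
        delta = laplace_profile(eps, eps0)
        print(f"eps={eps:4.2f}  delta(eps)={delta:.4f}  mu(eps)={gdpt_point(delta, eps):.4f}")
```

A finite supremum of μ(ε) over ε would indicate μ-GDP with that supremum as the parameter; roughly, in the head/tail language of the reviews, the question is whether this supremum is controlled on the small-ε (head) part once the tail is known to decay at a Gaussian rate.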
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: The paper investigates an algorithm for distributed learning with random Fourier features. The main idea is to sample m random Fourier features and split the data into m chunks; each chunk is processed on a separate machine that outputs a linear hypothesis using the sampled m random features. The hypotheses coming from different machines are then aggregated on the master machine via importance weighting; in particular, each hypothesis is assigned an importance weight proportional to its data chunk size (see Eq. 3). The regularization parameter is fixed across different machines. The main contribution of the work is a consistency bound; in comparison to a previous bound on the divide-and-conquer algorithm (Li et al., arXiv 2019), this one does not require a constant number of machines, in my understanding of the related work section.

Clarity: the paper is clear and easy to follow. I would say that the related work is fairly well covered and the contributions are appropriately placed in this regard.

Quality / significance: I have a fundamental disagreement when it comes to the considered distributed setting. In particular, the bottleneck in learning with random Fourier features is not the size of the dataset but the number of features: the computational complexity is linear in the dataset size and cubic in the number of features. Moreover, there are examples of machine learning problems where it is required to use a huge number of features for satisfactory results (e.g. see "Kernel approximation methods for speech recognition", May et al.). Thus I do not see this direction as significant; after all, the algorithm improves over the variable amounting to linear computational complexity (i.e. the dataset size). What would be interesting is to use different sets of random features on different machines and then aggregate on the master machine; in that way one would be tackling the factor contributing to cubic computational complexity.

docsep

This paper studies the statistical properties of distributed kernel ridge regression together with random features (DKRR-RF) and obtains optimal generalization bounds under the basic setting in the attainable cases. Numerical results are given for the studied new algorithms. The algorithms and the derived results are new and interesting to me; however, the presentation as well as the citations need some major revision before the publication. A few comments: It looks to me that the idea of distributed learning with communication has already appeared in arXiv:1906.04870, if not earlier. Page 1: the authors mention that distributed learning has been combined with multi-pass SGD, but they did not cite the related paper "Optimal distributed learning with multipass stochastic gradient methods" (ICML 2018). Page 2: optimal learning rates were also established for distributed spectral algorithms in JMLR 2018, 19(1): 1069–1097, or arXiv:1610.07487. Page 2 and the other locations: optimal learning rates with a less strict condition on the number of local machines were first established in JMLR 2020, 21(147): 1–63, or arXiv:1801.07226, if not earlier. Bottom of page 6: to my knowledge, the first one using the concentration inequality for self-adjoint operators to relax the restriction on the number of local machines is JMLR 2020, 21(147): 1–63, or arXiv:1801.07226, or "Optimal distributed learning with multipass stochastic gradient methods" (ICML 2018), if not earlier. Also, the first part of Proposition 6 was first proved in "Random design analysis of ridge regression" (COLT 2012) for the matrix case, and later was extended to the operator case in "On the sample complexity of subspace learning" (NeurIPS 2013). Finite-sample theoretical analysis about the approximation quality of RFFs has been established in "On the error of random Fourier features" (UAI 2015) and "Optimal rates for random Fourier features" (NeurIPS 2015). Numerical results on different datasets could be given to further exemplify the performance of the algorithm. How do you choose the regularization parameter λ in the distributed learning? Will this enlarge the computational complexity?

docsep

The paper analyses generalization properties of distributed kernel ridge regression (DKRR) with random features and communications. It studies optimal learning rates of the generalization bounds, both in expectation and in probability. In the case of DKRR with random features, the optimal learning rate in expectation is shown to be achieved by relaxing the requirement on the number of partitions from O(1) (Li et al., 2019a) to O(d^0.5) (Theorem 1). Within the same setup of random features, the number of partitions is relaxed to O(d^0.25) while guaranteeing optimal generalization performance in probability (Theorem 2). The latter bound O(d^0.25) on the partition count is much smaller than O(d^0.5); however, as proved in Theorem 3, allowing multiple communication rounds in DKRR-RF, up to O(d^0.5) partitions can be handled, depending on the number of communication rounds. In other words, it can exploit more partitions at the cost of more communication rounds. The idea of DKRR with random features and communications is a combination of DKRR with random features, studied in Li et al. (2019a), and DKRR with communications, studied in Lin et al. (2020). It seems that the communication strategy and Algorithm 1 presented in Section 3 are adaptations from Lin et al. (2020) handling random features, which should be properly mentioned and credited. During the comparison with Lin et al. (2020), it is mentioned that the latter work required communicating local data D_j among partition nodes; however, I failed to spot this in Lin et al. (2020) and instead found similar steps to Algorithm 1 in Section 2.3 of Lin et al. (2020), where only gradient information is communicated; please elaborate on this. Not being an expert in this narrow field, I think the improvements are essential and would be helpful for the associated community. The paper is well written, with a fair amount of discussion comparing with the results and proof techniques of recent works. ### Summary:
The focus of the submission is kernel ridge regression in the distributed setting. In particular, the authors present optimal learning rates under this assumption, both in expectation and in probability, while relaxing previous restrictions on the number of partitions taken. The effectiveness of the approach is demonstrated in synthetic and real-world settings. As summarized by the reviewers, the submission is well organized and clearly written; the authors focus on an important problem, and they present a fundamental theoretical contribution which also has clear practical impact. As such, the submission could be of interest to the ICLR and ML community.
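Since the first review above walks through the algorithm step by step (one shared draw of random Fourier features, independent ridge solutions per data chunk, aggregation weighted by chunk size), a compact sketch may help fix ideas. The feature map, the λ·n scaling inside the local solve and the n_j/n weights below follow my reading of that description, not the paper's Eq. 3, and the number of features is kept separate from the number of machines; treat those details as assumptions.

```python
import numpy as np

def rff_map(X: np.ndarray, W: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Random Fourier features for a Gaussian kernel: z(x) = sqrt(2/D) * cos(W x + b)."""
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

def local_ridge(Z: np.ndarray, y: np.ndarray, lam: float) -> np.ndarray:
    """Ridge solution computed on a single machine's chunk (lam shared across machines)."""
    n, D = Z.shape
    return np.linalg.solve(Z.T @ Z + lam * n * np.eye(D), Z.T @ y)

def distributed_rff_krr(chunks, lam, D, sigma, rng):
    """Divide-and-conquer KRR with one shared set of D random features.

    chunks: list of (X_j, y_j), one pair per machine.
    The aggregated hypothesis is a size-weighted average of the local solutions."""
    d = chunks[0][0].shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(D, d))  # spectral samples of the Gaussian kernel
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    n_total = sum(X.shape[0] for X, _ in chunks)
    alpha = np.zeros(D)
    for X_j, y_j in chunks:  # in a real deployment, each iteration runs on its own machine
        Z_j = rff_map(X_j, W, b)
        alpha += (X_j.shape[0] / n_total) * local_ridge(Z_j, y_j, lam)
    return W, b, alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(3000, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=3000)
    chunks = [(X[i::10], y[i::10]) for i in range(10)]  # 10 machines
    W, b, alpha = distributed_rff_krr(chunks, lam=1e-3, D=200, sigma=1.0, rng=rng)
    X_test = np.linspace(-3.0, 3.0, 200)[:, None]
    mse = float(np.mean((np.sin(X_test[:, 0]) - rff_map(X_test, W, b) @ alpha) ** 2))
    print(f"test MSE against the noiseless target: {mse:.4f}")
```

The design point the reviewer questions is visible here: the per-machine cost is dominated by the D × D solve (cubic in the number of features), while splitting the data only reduces the part that is linear in n.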
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 2340, 684, 271, 5933, 323, 5939, 4715, 342, 3632, 269, 15421, 3386, 253, 2022, 2934, 310, 281, 3410, 278, 3632, 269, 15421, 3386, 285, 8085, 253, 941, 715, 278, 30151, 1016, 20540, 310, 11742, 327, 247, 4858, 5145, 326, 18012, 247, 4872, 9079, 970, 253, 19958, 278, 3632, 3386, 253, 24316, 3551, 432, 1027, 10679, 403, 840, 40006, 327, 253, 6303, 5145, 3066, 6349, 42428, 275, 1798, 1016, 9079, 310, 7922, 6349, 2801, 14495, 281, 697, 941, 20540, 1979, 923, 16186, 495, 253, 37820, 4764, 310, 4229, 2439, 1027, 10679, 253, 2022, 7680, 273, 253, 789, 310, 247, 15274, 3033, 275, 5301, 281, 247, 2045, 3033, 327, 253, 10957, 50276, 585, 14056, 5933, 632, 1162, 355, 549, 32693, 6247, 436, 581, 1057, 417, 2430, 247, 3638, 1180, 273, 10679, 275, 619, 4685, 273, 253, 2905, 789, 2593, 50275, 498, 15752, 253, 2929, 310, 2590, 285, 3477, 281, 956, 891, 651, 1333, 326, 253, 2905, 789, 310, 9648, 973, 6107, 285, 253, 9021, 403, 20420, 4845, 275, 436, 2743, 50275, 15177, 50276, 9188, 40348, 891, 452, 247, 7936, 30859, 672, 352, 3249, 281, 253, 2783, 5939, 4758, 275, 1798, 253, 3673, 44856, 275, 4715, 342, 3632, 269, 15421, 3386, 310, 417, 253, 1979, 273, 253, 10895, 533, 253, 1180, 273, 3386, 253, 15180, 10454, 310, 4872, 275, 253, 10895, 1979, 285, 23664, 275, 253, 1180, 273, 3386, 25761, 627, 403, 6667, 273, 5145, 4715, 3237, 835, 352, 310, 2424, 281, 897, 247, 5699, 1180, 273, 3386, 323, 20297, 1543, 24088, 923, 10295, 11193, 3082, 323, 6519, 8981, 778, 1162, 355, 3021, 891, 513, 417, 923, 436, 3884, 347, 1534, 846, 512, 253, 5933, 19132, 689, 4778, 2408, 272, 281, 4872, 15180, 10454, 752, 651, 320, 4722, 310, 281, 897, 1027, 5239, 273, 3632, 3386, 327, 1027, 10679, 285, 840, 19737, 327, 253, 6303, 5145, 275, 326, 1039, 581, 651, 320, 46710, 253, 2803, 15979, 281, 23664, 15180, 10454, 7152, 33032, 2520, 2929, 2175, 253, 7605, 3607, 273, 5939, 10295, 27563, 9077, 2366, 342, 3632, 3386, 277, 76, 2676, 19232, 285, 4044, 8654, 26647, 14493, 762, 253, 5044, 4758, 275, 253, 20685, 494, 2219, 50276, 40907, 474, 1543, 403, 1677, 323, 253, 5421, 747, 11333, 253, 11333, 285, 253, 6012, 1543, 403, 747, 285, 4722, 281, 479, 2299, 253, 27228, 347, 973, 347, 253, 30404, 878, 690, 2201, 18520, 1078, 253, 9311, 50275, 8826, 1643, 5701, 50276, 262, 4453, 281, 479, 326, 253, 2934, 273, 5939, 4715, 342, 5511, 556, 2168, 5420, 275, 50276, 39962, 16129, 1549, 2385, 1967, 604, 417, 4321, 50276, 6377, 337, 253, 4477, 3748, 326, 5939, 4715, 556, 644, 5678, 342, 10796, 515, 256, 35333, 533, 597, 858, 417, 26542, 253, 2905, 2929, 8654, 5939, 4715, 342, 10796, 515, 19191, 11786, 3082, 17857, 1686, 4765, 3239, 374, 8654, 4715, 4142, 497, 671, 4232, 323, 5939, 9879, 11333, 275, 480, 1686, 83, 4765, 27446, 884, 2090, 740, 4148, 50276, 263, 549, 32693, 1036, 31182, 30910, 50276, 6377, 374, 285, 253, 643, 8593, 50276, 29776, 4715, 4142, 342, 247, 1679, 7654, 1617, 327, 253, 1180, 273, 1980, 10679, 497, 806, 4232, 275, 480, 1686, 83, 9169, 24978, 2504, 23309, 390, 549, 32693, 1093, 520, 2922, 21345, 604, 417, 4321, 5004, 273, 3239, 721, 281, 619, 3640, 50276, 783, 806, 581, 970, 253, 4719, 11370, 323, 1881, 44121, 9158, 281, 7921, 253, 12400, 327, 253, 1180, 273, 1980, 10679, 310, 480, 1686, 83, 9169, 24978, 2504, 23309, 390, 549, 32693, 1093, 520, 2922, 21345, 390, 8654, 5939, 4715, 342, 10796, 515, 19191, 11786, 3082, 17857, 1686, 4765, 604, 417, 4321, 50276, 12563, 253, 806, 629, 
273, 13989, 721, 369, 806, 8058, 275, 3632, 2216, 1783, 273, 27563, 9077, 4050, 847, 85, 323, 253, 4315, 1083, 285, 1996, 369, 6508, 281, 253, 5572, 1083, 275, 50276, 251, 253, 3410, 10454, 273, 24822, 4715, 5723, 2824, 4072, 1442, 3254, 4636, 10527, 1783, 670, 253, 11193, 3290, 273, 391, 567, 84, 556, 644, 4232, 275, 327, 253, 2228, 273, 3632, 269, 15421, 3386, 1484, 2284, 4104, 285, 8654, 4142, 323, 3632, 269, 15421, 3386, 5723, 2824, 4104, 50276, 40907, 474, 1543, 327, 1027, 15302, 812, 320, 1677, 281, 2007, 40924, 6644, 253, 3045, 273, 253, 5933, 50276, 5430, 513, 368, 5206, 253, 37820, 4764, 29331, 275, 253, 5939, 4715, 588, 436, 46112, 253, 15180, 10454, 7152, 339, 431, 248, 2929, 6260, 26647, 3607, 273, 5939, 10295, 27563, 9077, 277, 76, 2676, 342, 3632, 3386, 285, 10924, 352, 2175, 8654, 4715, 4142, 273, 253, 26647, 14493, 1097, 275, 15355, 285, 275, 5912, 275, 253, 1083, 273, 277, 76, 2676, 342, 3632, 3386, 253, 8654, 4715, 2281, 275, 15355, 310, 2011, 281, 5115, 407, 32196, 253, 8284, 327, 253, 1180, 273, 27959, 432, 258, 18, 632, 1162, 355, 6247, 66, 281, 7687, 1762, 10012, 337, 1561, 253, 1072, 9978, 273, 3632, 3386, 253, 1180, 273, 27959, 310, 19595, 281, 7687, 14521, 12215, 272, 8654, 26647, 3045, 275, 5912, 10012, 374, 253, 6158, 3033, 7687, 14521, 327, 10883, 1385, 310, 1199, 4577, 840, 7687, 1762, 2299, 347, 8058, 275, 10012, 495, 6941, 2709, 5511, 16334, 275, 277, 76, 2676, 19232, 598, 281, 7687, 1762, 27959, 476, 320, 15726, 7293, 327, 253, 1180, 273, 5511, 16334, 275, 643, 3000, 352, 476, 22059, 625, 27959, 387, 253, 2105, 273, 625, 5511, 16334, 50276, 783, 2934, 273, 277, 76, 2676, 342, 3632, 3386, 285, 10924, 310, 247, 5019, 273, 277, 76, 2676, 342, 3632, 3386, 5421, 275, 632, 1162, 355, 6247, 66, 285, 277, 76, 2676, 342, 10924, 5421, 275, 19169, 1162, 355, 9169, 352, 3133, 326, 253, 5511, 5700, 285, 5933, 337, 3559, 275, 2593, 495, 403, 41655, 432, 19169, 1162, 355, 9169, 10885, 3632, 4735, 534, 943, 320, 6283, 5393, 285, 26873, 50276, 32674, 253, 5301, 342, 19169, 1162, 355, 9169, 352, 310, 5393, 326, 253, 4857, 789, 2424, 26728, 1980, 941, 277, 75, 2190, 10883, 7632, 2299, 891, 4242, 281, 6308, 352, 275, 19169, 1162, 355, 9169, 285, 3185, 1119, 2074, 5018, 273, 5933, 337, 275, 2593, 3495, 273, 19169, 1162, 355, 9169, 835, 760, 11786, 1491, 310, 32452, 4496, 21184, 327, 436, 50276, 1439, 1146, 271, 6485, 275, 436, 6891, 1673, 891, 1158, 253, 11701, 403, 5667, 285, 651, 320, 9371, 323, 253, 2330, 3114, 253, 2929, 310, 973, 3542, 342, 4344, 2408, 273, 5955, 10941, 342, 253, 1543, 285, 4737, 5609, 273, 3332, 2987, 187, 187, 4118, 18435, 27, 783, 2770, 273, 253, 19529, 310, 10295, 27563, 9077, 275, 253, 5939, 4758, 3782, 253, 4477, 1246, 8654, 4715, 4142, 762, 436, 9376, 1097, 275, 15355, 285, 275, 5912, 1223, 597, 7921, 2045, 13133, 327, 253, 1180, 273, 27959, 2668, 253, 12510, 273, 253, 2746, 310, 5183, 275, 13506, 285, 1524, 10186, 7533, 50276, 284, 17903, 407, 253, 30628, 253, 19529, 310, 973, 34092, 285, 4518, 3542, 253, 4477, 2770, 327, 271, 1774, 1895, 597, 1246, 247, 7936, 10527, 7680, 534, 671, 556, 2590, 8542, 3486, 347, 824, 253, 19529, 812, 320, 273, 1600, 281, 253, 17857, 32888, 285, 13361, 3114 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 2340, 684, 271, 5933, 323, 5939, 4715, 342, 3632, 269, 15421, 3386, 253, 2022, 2934, 310, 281, 3410, 278, 3632, 269, 15421, 3386, 285, 8085, 253, 941, 715, 278, 30151, 1016, 20540, 310, 11742, 327, 247, 4858, 5145, 326, 18012, 247, 4872, 9079, 970, 253, 19958, 278, 3632, 3386, 253, 24316, 3551, 432, 1027, 10679, 403, 840, 40006, 327, 253, 6303, 5145, 3066, 6349, 42428, 275, 1798, 1016, 9079, 310, 7922, 6349, 2801, 14495, 281, 697, 941, 20540, 1979, 923, 16186, 495, 253, 37820, 4764, 310, 4229, 2439, 1027, 10679, 253, 2022, 7680, 273, 253, 789, 310, 247, 15274, 3033, 275, 5301, 281, 247, 2045, 3033, 327, 253, 10957, 50276, 585, 14056, 5933, 632, 1162, 355, 549, 32693, 6247, 436, 581, 1057, 417, 2430, 247, 3638, 1180, 273, 10679, 275, 619, 4685, 273, 253, 2905, 789, 2593, 50275, 498, 15752, 253, 2929, 310, 2590, 285, 3477, 281, 956, 891, 651, 1333, 326, 253, 2905, 789, 310, 9648, 973, 6107, 285, 253, 9021, 403, 20420, 4845, 275, 436, 2743, 50275, 15177, 50276, 9188, 40348, 891, 452, 247, 7936, 30859, 672, 352, 3249, 281, 253, 2783, 5939, 4758, 275, 1798, 253, 3673, 44856, 275, 4715, 342, 3632, 269, 15421, 3386, 310, 417, 253, 1979, 273, 253, 10895, 533, 253, 1180, 273, 3386, 253, 15180, 10454, 310, 4872, 275, 253, 10895, 1979, 285, 23664, 275, 253, 1180, 273, 3386, 25761, 627, 403, 6667, 273, 5145, 4715, 3237, 835, 352, 310, 2424, 281, 897, 247, 5699, 1180, 273, 3386, 323, 20297, 1543, 24088, 923, 10295, 11193, 3082, 323, 6519, 8981, 778, 1162, 355, 3021, 891, 513, 417, 923, 436, 3884, 347, 1534, 846, 512, 253, 5933, 19132, 689, 4778, 2408, 272, 281, 4872, 15180, 10454, 752, 651, 320, 4722, 310, 281, 897, 1027, 5239, 273, 3632, 3386, 327, 1027, 10679, 285, 840, 19737, 327, 253, 6303, 5145, 275, 326, 1039, 581, 651, 320, 46710, 253, 2803, 15979, 281, 23664, 15180, 10454, 7152, 33032, 2520, 2929, 2175, 253, 7605, 3607, 273, 5939, 10295, 27563, 9077, 2366, 342, 3632, 3386, 277, 76, 2676, 19232, 285, 4044, 8654, 26647, 14493, 762, 253, 5044, 4758, 275, 253, 20685, 494, 2219, 50276, 40907, 474, 1543, 403, 1677, 323, 253, 5421, 747, 11333, 253, 11333, 285, 253, 6012, 1543, 403, 747, 285, 4722, 281, 479, 2299, 253, 27228, 347, 973, 347, 253, 30404, 878, 690, 2201, 18520, 1078, 253, 9311, 50275, 8826, 1643, 5701, 50276, 262, 4453, 281, 479, 326, 253, 2934, 273, 5939, 4715, 342, 5511, 556, 2168, 5420, 275, 50276, 39962, 16129, 1549, 2385, 1967, 604, 417, 4321, 50276, 6377, 337, 253, 4477, 3748, 326, 5939, 4715, 556, 644, 5678, 342, 10796, 515, 256, 35333, 533, 597, 858, 417, 26542, 253, 2905, 2929, 8654, 5939, 4715, 342, 10796, 515, 19191, 11786, 3082, 17857, 1686, 4765, 3239, 374, 8654, 4715, 4142, 497, 671, 4232, 323, 5939, 9879, 11333, 275, 480, 1686, 83, 4765, 27446, 884, 2090, 740, 4148, 50276, 263, 549, 32693, 1036, 31182, 30910, 50276, 6377, 374, 285, 253, 643, 8593, 50276, 29776, 4715, 4142, 342, 247, 1679, 7654, 1617, 327, 253, 1180, 273, 1980, 10679, 497, 806, 4232, 275, 480, 1686, 83, 9169, 24978, 2504, 23309, 390, 549, 32693, 1093, 520, 2922, 21345, 604, 417, 4321, 5004, 273, 3239, 721, 281, 619, 3640, 50276, 783, 806, 581, 970, 253, 4719, 11370, 323, 1881, 44121, 9158, 281, 7921, 253, 12400, 327, 253, 1180, 273, 1980, 10679, 310, 480, 1686, 83, 9169, 24978, 2504, 23309, 390, 549, 32693, 1093, 520, 2922, 21345, 390, 8654, 5939, 4715, 342, 10796, 515, 19191, 11786, 3082, 17857, 1686, 4765, 604, 417, 4321, 50276, 12563, 253, 806, 629, 
273, 13989, 721, 369, 806, 8058, 275, 3632, 2216, 1783, 273, 27563, 9077, 4050, 847, 85, 323, 253, 4315, 1083, 285, 1996, 369, 6508, 281, 253, 5572, 1083, 275, 50276, 251, 253, 3410, 10454, 273, 24822, 4715, 5723, 2824, 4072, 1442, 3254, 4636, 10527, 1783, 670, 253, 11193, 3290, 273, 391, 567, 84, 556, 644, 4232, 275, 327, 253, 2228, 273, 3632, 269, 15421, 3386, 1484, 2284, 4104, 285, 8654, 4142, 323, 3632, 269, 15421, 3386, 5723, 2824, 4104, 50276, 40907, 474, 1543, 327, 1027, 15302, 812, 320, 1677, 281, 2007, 40924, 6644, 253, 3045, 273, 253, 5933, 50276, 5430, 513, 368, 5206, 253, 37820, 4764, 29331, 275, 253, 5939, 4715, 588, 436, 46112, 253, 15180, 10454, 7152, 339, 431, 248, 2929, 6260, 26647, 3607, 273, 5939, 10295, 27563, 9077, 277, 76, 2676, 342, 3632, 3386, 285, 10924, 352, 2175, 8654, 4715, 4142, 273, 253, 26647, 14493, 1097, 275, 15355, 285, 275, 5912, 275, 253, 1083, 273, 277, 76, 2676, 342, 3632, 3386, 253, 8654, 4715, 2281, 275, 15355, 310, 2011, 281, 5115, 407, 32196, 253, 8284, 327, 253, 1180, 273, 27959, 432, 258, 18, 632, 1162, 355, 6247, 66, 281, 7687, 1762, 10012, 337, 1561, 253, 1072, 9978, 273, 3632, 3386, 253, 1180, 273, 27959, 310, 19595, 281, 7687, 14521, 12215, 272, 8654, 26647, 3045, 275, 5912, 10012, 374, 253, 6158, 3033, 7687, 14521, 327, 10883, 1385, 310, 1199, 4577, 840, 7687, 1762, 2299, 347, 8058, 275, 10012, 495, 6941, 2709, 5511, 16334, 275, 277, 76, 2676, 19232, 598, 281, 7687, 1762, 27959, 476, 320, 15726, 7293, 327, 253, 1180, 273, 5511, 16334, 275, 643, 3000, 352, 476, 22059, 625, 27959, 387, 253, 2105, 273, 625, 5511, 16334, 50276, 783, 2934, 273, 277, 76, 2676, 342, 3632, 3386, 285, 10924, 310, 247, 5019, 273, 277, 76, 2676, 342, 3632, 3386, 5421, 275, 632, 1162, 355, 6247, 66, 285, 277, 76, 2676, 342, 10924, 5421, 275, 19169, 1162, 355, 9169, 352, 3133, 326, 253, 5511, 5700, 285, 5933, 337, 3559, 275, 2593, 495, 403, 41655, 432, 19169, 1162, 355, 9169, 10885, 3632, 4735, 534, 943, 320, 6283, 5393, 285, 26873, 50276, 32674, 253, 5301, 342, 19169, 1162, 355, 9169, 352, 310, 5393, 326, 253, 4857, 789, 2424, 26728, 1980, 941, 277, 75, 2190, 10883, 7632, 2299, 891, 4242, 281, 6308, 352, 275, 19169, 1162, 355, 9169, 285, 3185, 1119, 2074, 5018, 273, 5933, 337, 275, 2593, 3495, 273, 19169, 1162, 355, 9169, 835, 760, 11786, 1491, 310, 32452, 4496, 21184, 327, 436, 50276, 1439, 1146, 271, 6485, 275, 436, 6891, 1673, 891, 1158, 253, 11701, 403, 5667, 285, 651, 320, 9371, 323, 253, 2330, 3114, 253, 2929, 310, 973, 3542, 342, 4344, 2408, 273, 5955, 10941, 342, 253, 1543, 285, 4737, 5609, 273, 3332, 2987, 187, 187, 4118, 18435, 27, 783, 2770, 273, 253, 19529, 310, 10295, 27563, 9077, 275, 253, 5939, 4758, 3782, 253, 4477, 1246, 8654, 4715, 4142, 762, 436, 9376, 1097, 275, 15355, 285, 275, 5912, 1223, 597, 7921, 2045, 13133, 327, 253, 1180, 273, 27959, 2668, 253, 12510, 273, 253, 2746, 310, 5183, 275, 13506, 285, 1524, 10186, 7533, 50276, 284, 17903, 407, 253, 30628, 253, 19529, 310, 973, 34092, 285, 4518, 3542, 253, 4477, 2770, 327, 271, 1774, 1895, 597, 1246, 247, 7936, 10527, 7680, 534, 671, 556, 2590, 8542, 3486, 347, 824, 253, 19529, 812, 320, 273, 1600, 281, 253, 17857, 32888, 285, 13361, 3114 ]
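The integer rows above are consistent with tokenized copies of each example's text: an input_ids list, an all-ones attention_mask, and a labels list that mirrors input_ids. A minimal sketch of how such columns are typically built for causal-LM fine-tuning; the tokenizer checkpoint, prompt wording, and maximum length below are assumptions, since the dump does not name them:

```python
from transformers import AutoTokenizer

# Assumed checkpoint: any ~50k-vocabulary causal-LM tokenizer would yield ids in the range seen above.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

def build_example(review_text: str, summary_text: str, max_length: int = 2048) -> dict:
    """Pack one (review, summary) pair into input_ids / attention_mask / labels."""
    prompt = (
        "Below is given a review of a research paper from a conference journal. "
        "Please write a summary of the review.\n### Review:\n"
        + review_text
        + "\n### Summary:\n"
    )
    enc = tokenizer(prompt + summary_text, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all ones when no padding is applied
        # For causal-LM training the labels are commonly a copy of input_ids,
        # which would explain why the two long integer rows in an example match.
        "labels": list(enc["input_ids"]),
    }
```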
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes a new tree mdp theory and formalizes the branching problem as a tree mdp problem under this new formulation the resulting algorithm benefits sample efficiency and credit assignment this paper proposes a novel tree mdp theory which is well suited to the branching problem the new tree mdp formulation has an obvious advantage over the existing mdp formulation however the paper has some basic theoretical flaws my major concern is how could the method be generalized to the test environment in other words does there exist an optimal policy that can map the mip problem into an action assumption 4.2 seems strong to me the paper has offered two conditions in propositions 4.4 and 4.5 however in proposition 4.4 if the optimal value is provided the problem is reduced to another easier problem with an equality constraint is this true in proposition 4.5 the algorithms can be limited to a very limited scope docsepthe paper proposes a tree mdp formulation for learning to branch in milp branchandbound algorithms a policy gradient algorithm for solving milp tree mdps is proposed conditions for the correctness of the algorithm are derived the paper compares the tree mdp formulation with the vanilla rl formulation as well as a strong baseline default scip on 5 benchmark problem sets the main empirical result shows that the tree mdp formulation outperforms the vanilla rl formulation but does not outperform scip in 4 out of 5 problems similar results are observed for the transfer learning experiments update i thank the authors for their detailed responses after reading the other reviews and the responses im a bit more positively inclined towards the paper although the computational performance remains a bit of an issue for me ive upgraded my score to a weak accept strengths there appear to be a number of novel algorithmic contributions tree mdp using tree mdps to encode bb trees for milps policy gradient algorithm for tree mdps the proposed approach seems wellmotivated intuitively clear with correctness guarantees for objlim and dfs although i didnt check these carefully combined with the novelty this makes the paper interesting in itself the paper is tackling a problem of significant practical importance bb milps and improvements here would have a large impact the paper is clearly written and the proposed ideas and experiments are easy to follow weaknesses the empirical results dont show improved performance over the baseline in 4 out of 5 cases in the case where it does well multiple knapsack there is no deeper analysis overall the experimental section provides limited insight into the empirical characteristics of tmdp compared to scip its unclear if the tree mdp formulation can be used in other applications the paper does not place the tree mdp formulation into the broader and larger body of work on mdps as a result it becomes difficult to assess the impact of the algorithmic contributions this is reflected in my score for presentation and contribution the baseline scip branching rule is not described and the description of strong branching is very brief coupled with the above it becomes difficult to place the proposed ideas into the existing body of knowledge there is a discussion on the limitations and social impact in the checklist but not the main paper the authors could consider moving it to the main paper docsepthe main contribution of this paper is a new type of augmented mdp formulation called a
tree mdp which is motivated by the learning to branch problem in mixedinteger linear programming milp the intuition is to convert the default mdp formulation which we can think of as being defined over subtrees into a simpler formulation that is defined over nodes instead the authors then show computational experiments on a set of benchmarks showing that the new mdp formulation is more amenable to rl strengths i found the tree mdp idea used in the paper to be novel and creative and to directly target the motivating problem without extraneous features and believe that this is a valuable contribution to the literature on applying rl to milp weaknesses the ideas in the paper took me some time to properly digest i believe nearly all of the information needed for the reader to digest is there but think that the paper could make this process easier on the reader please see the questions below for specific queries they have to some extent but i have also asked for a few more points in the questions section above ### Summary:
the paper studies the milp problem by providing a tree mdp framework for a more suitable formulation of the branching problem the reviewers believe that this approach is relevant and novel while in the first round of the review the reviewers had identified a number of concerns such as the applicability of the tree mdp concerns about the comparison of baselines and presentation clarity the authors have addressed these issues in a satisfactory way in the rebuttal phase all the reviewers unanimously agree to accept the paper
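The tree mdp idea summarized above assigns credit at the level of branch-and-bound nodes rather than along one flat trajectory: each branching decision is judged by the subtree it spawns. A minimal sketch of that node-level credit assignment with a REINFORCE-style objective; the node structure, the per-node reward, and the absence of discounting are illustrative assumptions, not the paper's exact algorithm:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    """One branch-and-bound node (field names are hypothetical)."""
    logprob: float                 # log-probability of the branching action taken at this node
    reward: float                  # e.g. -1 per node, so returns track subtree size
    children: List["Node"] = field(default_factory=list)

def subtree_return(node: Node) -> float:
    """Tree-MDP return: the node's reward plus the returns of the subtrees it creates."""
    return node.reward + sum(subtree_return(child) for child in node.children)

def reinforce_loss(root: Node) -> float:
    """Each branching decision is reinforced only by what happens in its own subtree."""
    loss, stack = 0.0, [root]
    while stack:
        node = stack.pop()
        loss -= node.logprob * subtree_return(node)
        stack.extend(node.children)
    return loss
```

With a reward of -1 per node, subtree_return gives minus the size of the subtree created by each decision, matching the intuition that good branching choices keep the resulting tree small.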
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 247, 747, 5202, 278, 12132, 3762, 285, 7473, 4219, 253, 27213, 1895, 347, 247, 5202, 278, 12132, 1895, 762, 436, 747, 15895, 253, 4795, 5933, 5373, 3410, 6733, 285, 6152, 12714, 50276, 2520, 2929, 29328, 247, 4460, 5202, 278, 12132, 3762, 534, 310, 973, 14662, 323, 253, 27213, 1895, 253, 747, 5202, 278, 12132, 15895, 556, 271, 4755, 5750, 689, 253, 5368, 278, 12132, 15895, 50276, 35529, 253, 2929, 556, 690, 5044, 10527, 32138, 50276, 2577, 2201, 4468, 310, 849, 812, 253, 1332, 320, 14923, 281, 253, 1071, 3126, 275, 643, 3000, 1057, 627, 4961, 271, 8654, 3646, 326, 476, 3711, 253, 278, 532, 1895, 715, 271, 2250, 50276, 515, 23892, 5976, 310, 2266, 281, 479, 253, 2929, 556, 5907, 767, 2515, 275, 13989, 7127, 285, 5329, 50274, 35529, 275, 268, 7127, 604, 253, 8654, 1318, 310, 2530, 253, 1895, 310, 3777, 281, 1529, 6927, 1895, 342, 13919, 7658, 310, 436, 2032, 275, 268, 5329, 253, 11333, 476, 320, 3710, 281, 247, 1077, 3710, 7990, 50274, 7152, 339, 431, 248, 2929, 29328, 247, 5202, 278, 12132, 15895, 323, 4715, 281, 7789, 275, 2301, 81, 7789, 395, 9458, 11333, 247, 3646, 11786, 5933, 323, 16161, 2301, 81, 5202, 31934, 793, 310, 4081, 2515, 323, 253, 36594, 273, 253, 5933, 403, 6012, 253, 2929, 26662, 253, 5202, 278, 12132, 15895, 342, 253, 26724, 391, 77, 15895, 347, 973, 347, 247, 2266, 8245, 4284, 660, 532, 327, 608, 22791, 1895, 5239, 253, 2022, 16774, 906, 2722, 326, 253, 5202, 278, 12132, 15895, 41731, 13015, 253, 26724, 391, 77, 15895, 533, 1057, 417, 562, 32231, 660, 532, 275, 577, 562, 273, 608, 3237, 2074, 1543, 403, 2540, 323, 253, 3700, 4715, 4679, 50274, 11183, 891, 5717, 253, 4477, 323, 616, 7000, 6128, 846, 4361, 253, 643, 10123, 285, 253, 6128, 516, 247, 2372, 625, 14962, 21802, 4404, 253, 2929, 3738, 253, 15180, 3045, 4558, 247, 2372, 273, 271, 2523, 323, 479, 209, 422, 29101, 619, 4868, 281, 247, 5075, 2997, 20544, 50274, 9088, 3176, 281, 320, 247, 1180, 273, 4460, 5933, 280, 9021, 5202, 278, 12132, 970, 5202, 31934, 793, 281, 22573, 48557, 7139, 323, 2301, 793, 3646, 11786, 5933, 323, 5202, 31934, 793, 50274, 783, 4081, 2746, 3133, 973, 24013, 8550, 540, 41597, 2590, 342, 36594, 23632, 323, 10928, 2815, 285, 285, 277, 3671, 3738, 891, 42126, 2451, 841, 9257, 5678, 342, 253, 38135, 436, 2789, 253, 2929, 4722, 275, 3139, 50274, 783, 2929, 310, 46710, 247, 1895, 273, 1534, 8542, 6349, 48557, 2301, 793, 285, 11701, 1060, 651, 452, 247, 1781, 3486, 50274, 783, 2929, 310, 4518, 3542, 285, 253, 4081, 5697, 285, 4679, 403, 3477, 281, 956, 50276, 20881, 1255, 265, 50274, 783, 16774, 1543, 13414, 921, 5520, 3045, 689, 253, 8245, 275, 577, 562, 273, 608, 2219, 275, 253, 1083, 835, 352, 1057, 973, 2709, 694, 1825, 471, 627, 310, 642, 12861, 1783, 4583, 253, 5661, 2593, 3400, 3710, 12288, 715, 253, 16774, 5319, 273, 246, 6535, 81, 2429, 281, 660, 532, 50274, 953, 12744, 604, 253, 5202, 278, 12132, 15895, 476, 320, 908, 275, 643, 4893, 253, 2929, 1057, 417, 1659, 253, 5202, 278, 12132, 15895, 715, 253, 16055, 285, 1781, 2133, 273, 789, 327, 31934, 793, 347, 247, 906, 352, 4916, 2834, 281, 2939, 253, 3486, 273, 253, 5933, 280, 9021, 436, 310, 11392, 275, 619, 4868, 323, 9759, 285, 7680, 50274, 783, 8245, 660, 532, 27213, 4086, 310, 417, 2529, 285, 253, 5740, 2266, 27213, 310, 1077, 4864, 9904, 342, 253, 1840, 352, 4916, 2834, 281, 1659, 253, 4081, 5697, 715, 253, 5368, 2133, 273, 3640, 627, 310, 247, 5955, 327, 253, 7364, 285, 
2675, 3486, 275, 253, 44282, 533, 417, 253, 2022, 2929, 253, 4477, 812, 1908, 4886, 352, 281, 253, 2022, 2929, 5474, 339, 431, 248, 2022, 7680, 273, 436, 2929, 310, 247, 747, 1511, 273, 31612, 278, 12132, 15895, 1925, 247, 5202, 278, 12132, 534, 310, 17194, 407, 253, 4715, 281, 7789, 1895, 275, 6804, 18743, 4872, 10717, 2301, 81, 253, 30328, 310, 281, 6455, 253, 4284, 278, 12132, 15895, 534, 359, 476, 1158, 273, 281, 320, 2931, 689, 749, 45670, 715, 247, 19554, 15895, 326, 310, 2931, 689, 7632, 3185, 253, 4477, 840, 921, 15180, 4679, 327, 247, 873, 273, 49602, 4645, 326, 253, 747, 278, 12132, 15895, 310, 625, 42133, 323, 391, 77, 20544, 891, 1119, 253, 5202, 278, 12132, 2934, 908, 275, 253, 2929, 281, 320, 4460, 10995, 285, 3587, 8571, 253, 15265, 839, 1895, 1293, 15534, 6473, 3386, 285, 2868, 326, 436, 310, 247, 9865, 7680, 281, 253, 6239, 327, 9433, 391, 77, 281, 2301, 81, 50276, 20881, 1255, 265, 253, 5697, 275, 253, 2929, 2335, 479, 690, 673, 281, 6283, 19818, 891, 2868, 4829, 512, 273, 253, 1491, 3058, 323, 253, 9414, 281, 19818, 310, 627, 533, 1158, 326, 253, 2929, 812, 1056, 436, 1232, 6927, 327, 253, 9414, 4496, 923, 253, 3533, 2708, 323, 2173, 19241, 597, 452, 281, 690, 6070, 533, 891, 452, 671, 2546, 323, 247, 1643, 625, 2792, 275, 253, 3533, 2593, 1840, 2490, 187, 4118, 18435, 27, 783, 2929, 2175, 253, 2301, 81, 1895, 407, 5277, 247, 5202, 278, 12132, 7792, 323, 247, 625, 7470, 15895, 273, 253, 27213, 1895, 50276, 783, 30628, 2868, 326, 436, 2746, 310, 4623, 285, 4460, 1223, 275, 253, 806, 3790, 273, 253, 2278, 253, 30628, 574, 3636, 247, 1180, 273, 7350, 824, 347, 253, 30437, 273, 253, 5202, 278, 12132, 7350, 670, 253, 5301, 273, 1666, 25379, 285, 9759, 8254, 1005, 253, 4477, 452, 9713, 841, 3374, 275, 247, 20297, 1039, 275, 253, 30080, 22559, 3408, 512, 253, 30628, 38350, 5194, 281, 2997, 253, 2929 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 247, 747, 5202, 278, 12132, 3762, 285, 7473, 4219, 253, 27213, 1895, 347, 247, 5202, 278, 12132, 1895, 762, 436, 747, 15895, 253, 4795, 5933, 5373, 3410, 6733, 285, 6152, 12714, 50276, 2520, 2929, 29328, 247, 4460, 5202, 278, 12132, 3762, 534, 310, 973, 14662, 323, 253, 27213, 1895, 253, 747, 5202, 278, 12132, 15895, 556, 271, 4755, 5750, 689, 253, 5368, 278, 12132, 15895, 50276, 35529, 253, 2929, 556, 690, 5044, 10527, 32138, 50276, 2577, 2201, 4468, 310, 849, 812, 253, 1332, 320, 14923, 281, 253, 1071, 3126, 275, 643, 3000, 1057, 627, 4961, 271, 8654, 3646, 326, 476, 3711, 253, 278, 532, 1895, 715, 271, 2250, 50276, 515, 23892, 5976, 310, 2266, 281, 479, 253, 2929, 556, 5907, 767, 2515, 275, 13989, 7127, 285, 5329, 50274, 35529, 275, 268, 7127, 604, 253, 8654, 1318, 310, 2530, 253, 1895, 310, 3777, 281, 1529, 6927, 1895, 342, 13919, 7658, 310, 436, 2032, 275, 268, 5329, 253, 11333, 476, 320, 3710, 281, 247, 1077, 3710, 7990, 50274, 7152, 339, 431, 248, 2929, 29328, 247, 5202, 278, 12132, 15895, 323, 4715, 281, 7789, 275, 2301, 81, 7789, 395, 9458, 11333, 247, 3646, 11786, 5933, 323, 16161, 2301, 81, 5202, 31934, 793, 310, 4081, 2515, 323, 253, 36594, 273, 253, 5933, 403, 6012, 253, 2929, 26662, 253, 5202, 278, 12132, 15895, 342, 253, 26724, 391, 77, 15895, 347, 973, 347, 247, 2266, 8245, 4284, 660, 532, 327, 608, 22791, 1895, 5239, 253, 2022, 16774, 906, 2722, 326, 253, 5202, 278, 12132, 15895, 41731, 13015, 253, 26724, 391, 77, 15895, 533, 1057, 417, 562, 32231, 660, 532, 275, 577, 562, 273, 608, 3237, 2074, 1543, 403, 2540, 323, 253, 3700, 4715, 4679, 50274, 11183, 891, 5717, 253, 4477, 323, 616, 7000, 6128, 846, 4361, 253, 643, 10123, 285, 253, 6128, 516, 247, 2372, 625, 14962, 21802, 4404, 253, 2929, 3738, 253, 15180, 3045, 4558, 247, 2372, 273, 271, 2523, 323, 479, 209, 422, 29101, 619, 4868, 281, 247, 5075, 2997, 20544, 50274, 9088, 3176, 281, 320, 247, 1180, 273, 4460, 5933, 280, 9021, 5202, 278, 12132, 970, 5202, 31934, 793, 281, 22573, 48557, 7139, 323, 2301, 793, 3646, 11786, 5933, 323, 5202, 31934, 793, 50274, 783, 4081, 2746, 3133, 973, 24013, 8550, 540, 41597, 2590, 342, 36594, 23632, 323, 10928, 2815, 285, 285, 277, 3671, 3738, 891, 42126, 2451, 841, 9257, 5678, 342, 253, 38135, 436, 2789, 253, 2929, 4722, 275, 3139, 50274, 783, 2929, 310, 46710, 247, 1895, 273, 1534, 8542, 6349, 48557, 2301, 793, 285, 11701, 1060, 651, 452, 247, 1781, 3486, 50274, 783, 2929, 310, 4518, 3542, 285, 253, 4081, 5697, 285, 4679, 403, 3477, 281, 956, 50276, 20881, 1255, 265, 50274, 783, 16774, 1543, 13414, 921, 5520, 3045, 689, 253, 8245, 275, 577, 562, 273, 608, 2219, 275, 253, 1083, 835, 352, 1057, 973, 2709, 694, 1825, 471, 627, 310, 642, 12861, 1783, 4583, 253, 5661, 2593, 3400, 3710, 12288, 715, 253, 16774, 5319, 273, 246, 6535, 81, 2429, 281, 660, 532, 50274, 953, 12744, 604, 253, 5202, 278, 12132, 15895, 476, 320, 908, 275, 643, 4893, 253, 2929, 1057, 417, 1659, 253, 5202, 278, 12132, 15895, 715, 253, 16055, 285, 1781, 2133, 273, 789, 327, 31934, 793, 347, 247, 906, 352, 4916, 2834, 281, 2939, 253, 3486, 273, 253, 5933, 280, 9021, 436, 310, 11392, 275, 619, 4868, 323, 9759, 285, 7680, 50274, 783, 8245, 660, 532, 27213, 4086, 310, 417, 2529, 285, 253, 5740, 2266, 27213, 310, 1077, 4864, 9904, 342, 253, 1840, 352, 4916, 2834, 281, 1659, 253, 4081, 5697, 715, 253, 5368, 2133, 273, 3640, 627, 310, 247, 5955, 327, 253, 7364, 285, 
2675, 3486, 275, 253, 44282, 533, 417, 253, 2022, 2929, 253, 4477, 812, 1908, 4886, 352, 281, 253, 2022, 2929, 5474, 339, 431, 248, 2022, 7680, 273, 436, 2929, 310, 247, 747, 1511, 273, 31612, 278, 12132, 15895, 1925, 247, 5202, 278, 12132, 534, 310, 17194, 407, 253, 4715, 281, 7789, 1895, 275, 6804, 18743, 4872, 10717, 2301, 81, 253, 30328, 310, 281, 6455, 253, 4284, 278, 12132, 15895, 534, 359, 476, 1158, 273, 281, 320, 2931, 689, 749, 45670, 715, 247, 19554, 15895, 326, 310, 2931, 689, 7632, 3185, 253, 4477, 840, 921, 15180, 4679, 327, 247, 873, 273, 49602, 4645, 326, 253, 747, 278, 12132, 15895, 310, 625, 42133, 323, 391, 77, 20544, 891, 1119, 253, 5202, 278, 12132, 2934, 908, 275, 253, 2929, 281, 320, 4460, 10995, 285, 3587, 8571, 253, 15265, 839, 1895, 1293, 15534, 6473, 3386, 285, 2868, 326, 436, 310, 247, 9865, 7680, 281, 253, 6239, 327, 9433, 391, 77, 281, 2301, 81, 50276, 20881, 1255, 265, 253, 5697, 275, 253, 2929, 2335, 479, 690, 673, 281, 6283, 19818, 891, 2868, 4829, 512, 273, 253, 1491, 3058, 323, 253, 9414, 281, 19818, 310, 627, 533, 1158, 326, 253, 2929, 812, 1056, 436, 1232, 6927, 327, 253, 9414, 4496, 923, 253, 3533, 2708, 323, 2173, 19241, 597, 452, 281, 690, 6070, 533, 891, 452, 671, 2546, 323, 247, 1643, 625, 2792, 275, 253, 3533, 2593, 1840, 2490, 187, 4118, 18435, 27, 783, 2929, 2175, 253, 2301, 81, 1895, 407, 5277, 247, 5202, 278, 12132, 7792, 323, 247, 625, 7470, 15895, 273, 253, 27213, 1895, 50276, 783, 30628, 2868, 326, 436, 2746, 310, 4623, 285, 4460, 1223, 275, 253, 806, 3790, 273, 253, 2278, 253, 30628, 574, 3636, 247, 1180, 273, 7350, 824, 347, 253, 30437, 273, 253, 5202, 278, 12132, 7350, 670, 253, 5301, 273, 1666, 25379, 285, 9759, 8254, 1005, 253, 4477, 452, 9713, 841, 3374, 275, 247, 20297, 1039, 275, 253, 30080, 22559, 3408, 512, 253, 30628, 38350, 5194, 281, 2997, 253, 2929 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper describes techniques to train ebms according to the length of mcmc trajectories required the paper explores three applications of ebms image synthesis adversarial defense and density estimation each of these applications uses a different regime of mcmc sampling length an approach is described for modifying the initialization of mcmc both during training and at testtime to improve the performance in each of these applications for image synthesis the paper proposes using a combination of persistent chains and cooperative learning persistent chains with rejuvenation can cause instability since negative samples can vary between freshly rejuvenated samples and longer run samples however no rejuvenation often leads to lack of sample diversity persistent banks dont scale well to large datasets since they cannot capture all the variability of the larger dataset instead chains can be rejuvenated from a fast generator model which is trained jointly with the ebm known as cooperative learning the authors identify and fix a key issue with cooperative learning the issue is that generators early in training are unable to generate a diverse set of samples that the ebm can refine one fix is to use batch normalization in the generator instead the authors propose using a persistent bank over the joint distribution of latent noise and generated images where the latent noise is input to the generator which the sampling process is rejuvenated from with the usual process image synthesis section comments it should be more clearly explained why batch normalization in the generator isnt enough to fix the issue section 2.2 is confusing isnt the persistent bank just over the distribution of x and z factorized as p(x|z)p(z) where z is the latent vector and x is the generated image adversarial defense annealing in the final paragraph of 3.1 refers to annealing the learning rate right section 3.2 its unclear how the hyperparameters are specified as i understand it two of k kdef and prejuv must be specified which of the two are specified and to what values are the experiments justifying the reasoning given in the last paragraph of 3.1 how similar are the trajectories of length k late in training to actual trajectories of length kdef when the learning rate is annealed in figure 6 why does robustness decrease for large k shouldnt larger k used during training lead to better performance when using kdef at testtime how do the results compare when using data samples instead of a pretrained generator density estimation first paragraph section 4.1 however persistent learning without rejuvenation has shortcomings mentioned in section 3 section 2 persistent samples that are newly rejuvenated up to about 50k langevin steps since rejuvenation and possibly many more cannot be approximate steadystate samples for any current known rejuvenation sources including data generators and noise i think this sentence needs more elaboration definition of terms my understanding is as follows samples in a bank that have undergone less than 50k langevin updates cannot be steadystate samples what is 50k defined relative to is it supposed to be half of 100k which is the number of steps being used to estimate the density what is the definition of lifetime langevin updates does this refer to the number of langevin updates applied to a sample between different training iterations eg by being sampled from the persistent bank and its updated version
back into the persistent bank samples in the newly rejuvenated bank that have been updated sufficiently many times will eventually replace samples from the bank used to update the ebm at which point newly rejuvenated states will be added to the first bank the first bank refers to the bank for newly rejuvenated samples correct how do the results compare when using data samples instead of a pretrained generator are there experiments measuring the quality of the ebm density my understanding is that these experiments show that ebms can be trained so that longrun mcmc samples which are closer to samples from the true ebm density are highquality are there experiments measuring how good the density is directly for example experiments on outofdistribution detection a small tractable model on mnist where the normalizing constant can be estimated and compare the loglikelihood to exact density models as well as exact samples to longrun mcmc samples the prior ebm used in eqn 2 seems like a little bit of a hack to me it sounds like it could just be slowing down how fast mcmc converges before oversaturation are there any plots showing that mcmc achieves a wide region of stability where the sample quality and diversity stays relatively constant typos section 1 introduction an misaligned 3rd paragraph section 2.2 hybrid persistent cooperative initialization uses paired latent and image states that a drawn from figure 2 caption section 4 longrun sampling for density estimation these outcomes not equivalent 2nd paragraph the lack scalable methods 2nd paragraph by introducing an mcmc initialization can incorporate 2nd paragraph section 4.1 incorporating rejuvenation in density estimation initializatin figure 4 caption remain the the figure 4 caption burnin bank figure 4 caption until they have approach figure 4 caption text under eq 2 does not explain u0 term overall the paper is wellstructured and clear and presents unique and interesting ideas for training unconditional ebms for different applications there are some places where clarity could be improved and some additional experiments which i think might improve some of the points in the paper docsepthe paper discusses learning strategies for energybased models shortrun sampling for image generation midrun sampling for adversarial defense and longrun sampling for density estimation the paper claims these methods achieve significant performance gain across the three applications and achieved stateoftheart performances strengths the paper investigates training heuristics for energybased models weaknesses i am not convinced by the argument that the performances are stateoftheart in table 1 the proposed ebm method clearly does not outperform gans and recently proposed diffusion scorebased generative models in table 2 the number of ours has a typo of 00566 in cifar10 and has worse natural accuracy than the ebm method in hill et al and imagenet results are clearly not outperforming baselines although the stateoftheart claim avoids this there are no density estimates on the test set only fid since the method does not discuss how to estimate the partition function we have yet to discuss the amount of compute and time needed to perform training and inference with ebms which are quite slow as well the paper does not have many technical novelties most of the discussion is about the usage or not of persistent initialization and pretrained generator for rejuvenation the rejuvenation model is so important in all three cases that a reasonable alternative is to use a good
rejuvenation model and avoid ebm entirely the heuristics themselves might be helpful in ebm learning but it is questionable why we would adopt an ebm in the first place if it does not perform well on image generation/density estimation compared to other models gans flow models diffusion models for example a pretrained sngan is used to rejuvenate ebms section 4 yet the model itself outperforms ebms in terms of fid see table 1 if the method claims density estimation then we should expect to see results other than fid since that is image generation even if the partition function is not possible to compute it can still help to see if density estimation has other uses such as outofdistribution detection post rebuttal response i appreciate the authors taking the time to discuss and address my concerns unfortunately i will keep my scores as is similar to reviewer pepn i am not entirely convinced about using fid in the density modeling section sure density modeling does not require partition functions and both fid and likelihood are flawed as measurements of image quality but the argument used in this paper can also be used in gans it is also modeling some kind of density and you can evaluate fid with it ebms are more expressive than gans in the sense that they also give unnormalized densities but the paper never used these densities for any meaningful tasks like outofdistribution detection the rebuttal also mentions that ours represent a significant departure from mainstream approaches for learning and evaluating density models i dont see how evaluating fid is a significant departure most ebmbased work after 2019 already reports fids existing work has shown that diffusion models can beat gans on imagenet even with higher resolutions https://github.com/openai/guided-diffusion this includes unconditional and conditional models the technically novel bit in this paper seems to be using pretrained generators for rejuvenation and if there are models that already beat ebm in image generation why would we train and use another ebm instead the rebuttal mentions that alternatives such as variational approximation yield much worse distributional approximations than mcmc which might be true in principle but can be quite far from what happens in practice since mcmc can have slow mixing speed again diffusion models are inspired by variational inference and seem to have quite good fids all in all i find that the paper and rebuttal make some claims in favor of ebms and the paper itself that i cannot fully agree with the paper spans various different problems but makes relatively little contribution to each of them the claims over empirical performances are inflated the ebmbased methods do not have an advantage over scorebased diffusion generative models in terms of generation and likelihood evaluation which also avoids estimating partition functions docsepthis paper discusses 3 applications for ebms image generation adversarial robustness and density modeling the authors discuss how the standard methods for training ebms do not adequately address the latter two of these applications for this reason they propose a new method which combines ideas from 2 popular ebm training methods coopnets and pcd to alleviate some issues with both methods the authors present results on the 3 tasks mentioned and their method appears favorable compared to other ebm variants strengths this paper addresses a large issue in the ebm space current methods for training these models are difficult to tune unstable and slow compared to other
generative modeling approaches the proposed fix combining ideas from coopnets and pcd may provide a best-of-both-worlds approach to training ebms while adding no overhead on top of either of these training methods further this paper correctly brings attention to many real issues in the ebm space such as the sample quality density model quality tradeoff as well the paper addresses adversarial robustness which appears to be a powerful and promising application of ebms while i am not an expert in adversarial robustness and cannot make superstrong claims about the correctness of these experiments it appears to me that the robustness results give a solid improvement over [6] and many other techniques purely meant for this task weaknesses while the authors provide some highlevel intuition for their proposed modifications to coopnets/pcd they do not provide any theoretical justification for their proposed approach have you considered other ways to improve the diversity of the coopnet samples could this be due to the approximations that are made with the standard coopnet learning algorithm which the authors mention you could regularize entropy as in [5, 4] i feel the paper would be made much stronger if more care was placed on the details and more theoretical justification was given the experimental details in the paper are quite scant the authors do not mention anything about the optimizers used the learning rates the batch sizes or training time as such the results in this paper are not reproducible i am aware the authors have included code but these details should be accessible from the paper at least in the appendix further from an investigation of the code there appear to be a number of additional experimental hyperparameters such as gradient clipping and energy tempering which are not once mentioned in the paper these modifications to the training objective can be very responsible for the success/failure of ebms when training in this way and there has been a good deal of work discussing it this work is attempting to improve stability and success of ebm training many of these tricks or hacks depending on your point of view are there to accommodate for many of the issues that this paper proposes to address such as high variance of negative sample gradients from training on newly resampled examples so it should be made clear which if any of these tricks are no longer needed i have some issues with the evaluation used in the longrun sampling for density estimation section the authors claim early in the paper that sample quality can be a misleading indicator of model quality where quality is evaluated by maximum likelihood which i completely agree with i was then disappointed to find the only quantitative result in this section was fid which is a qualitative measure of sample quality if the authors want to argue that their training procedure learns a better density model then there are many alternative evaluations which could be used and have been used in recent ebm work you could use ais/raise to estimate upper/lower bounds on likelihood as in [1, 2] or you could use your training procedure to train a tractable likelihood model and then evaluate using the models known likelihood as in [3, 4] minor typo in table 2 [1] du yilun and igor mordatch implicit generation and generalization in energybased models arxiv preprint arxiv:1903.08689 2019 [2] gao ruiqi et al learning energybased models by diffusion recovery likelihood arxiv preprint arxiv:2012.08125 2020 [3] song yang et al sliced score matching a scalable approach to density
and score estimation uncertainty in artificial intelligence pmlr 2020 [4] grathwohl will et al no mcmc for me amortized sampling for fast and stable training of energybased models arxiv preprint arxiv:2010.04230 2020 [5] dieng adji b et al prescribed generative adversarial networks arxiv preprint arxiv:1910.04302 2019 [6] hill mitch jonathan mitchell and songchun zhu stochastic security adversarial defense using longrun dynamics of energybased models arxiv preprint arxiv:2005.13525 2020 post author response i appreciate the authors responding to my feedback and thank them for the changes they have made to the paper unfortunately i do not find their arguments convincing regarding density estimation and the use of fid to evaluate the method as well if the authors were able to demonstrate that their method simplifies ebm training while offering similar quality results this would be compelling but i do not feel this was done in the work the proposed method is more complicated than coopnets and pcd and provides additional hyperparameters to tune if the authors were able to make a compelling argument that their method allows us to learn a better density model then i would support acceptance of this work but i do not believe this was done for this reason i will keep my score the same there are some interesting ideas presented in this work but i do not feel the results presented demonstrate that they are a sufficient contribution to the field for acceptance this paper presents a new method for training ebms which combines features of two popular training approaches coopnets and pcd while the proposed method makes sense and appears to work there are considerable issues with the methods evaluation experimental details and theoretical justification thus in its current form i do not advocate for its acceptance docsepthis work presents different mcmc initialization techniques for training ebm with different lengths for the purposes of image generation adversarial defense and density estimation respectively more specifically for shortrun image generation the author proposed a hybrid persistent cooperative initialization that mitigates the lack of diversity issue of ebm learning with generator initialization for midrun adversarial defense the author proposed a pretrained generator rejuvenation to scale up ebm defense for longrun density estimation the author proposed rejuvenation methods and a regularization trick to correct oversaturation the paper is well written and easy to follow the empirical performance looks good compared to the baseline models although the fairness requires further investigation the weakness of the paper is that the contribution seems to be incremental as most of the techniques used in this paper have been proposed somewhere else below are some of my concerns 1 for image generation why not use a pretrained generator and if the generator already provides high quality samples eg sngan what is the purpose of using ebm after it did the baseline ebms use generator initialization also i did not see a comparison to xie et al 2018 2 how does the pretrained generator initialization compare with data samples how important is annealing is annealing applied to other baseline methods lack of ablation studies here 3 where is the density estimation result for longrun ebm overall the paper provides some interesting empirical findings for ebms with different mcmc lengths however current experiment results and lack of novelty make it a bit under the bar of iclr ### Summary:
this paper proposes a strategy to train ebms according to the length of mcmc trajectories required the paper covers three settings with different lengths of mcmc image synthesis adversarial defense and density estimation the reviewers generally find that there are interesting ideas and promising results in the paper but the paper is not ready to publish at its current stage the argument regarding density estimation and fid evaluation is not convincing the proposed method is also more complicated than the baseline methods coopnets and pcd and we would need a stronger argument for the added complexity
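The reviews and summary above repeatedly refer to short-run Langevin sampling from a persistent bank that is occasionally rejuvenated from a pretrained generator. A minimal PyTorch sketch of that negative-sampling loop; the step sizes, the rejuvenation probability, and the generator.latent_dim attribute are placeholders rather than the paper's actual settings:

```python
import torch

def langevin_sample(ebm, x, n_steps, step_size=0.01, noise_scale=0.005):
    """Plain Langevin updates on an energy function (hyperparameters are illustrative)."""
    x = x.clone().detach().requires_grad_(True)
    for _ in range(n_steps):
        energy = ebm(x).sum()
        grad, = torch.autograd.grad(energy, x)
        x = (x - 0.5 * step_size * grad
             + noise_scale * torch.randn_like(x)).detach().requires_grad_(True)
    return x.detach()

def negative_samples(ebm, generator, bank, batch_size, n_steps, rejuvenation_prob=0.05):
    """Persistent initialization with occasional generator rejuvenation, in the spirit
    of the hybrid scheme discussed in the reviews above."""
    idx = torch.randint(len(bank), (batch_size,))
    init = bank[idx].clone()
    mask = torch.rand(batch_size) < rejuvenation_prob
    if mask.any():
        # restart a small fraction of chains from freshly generated images
        z = torch.randn(int(mask.sum()), generator.latent_dim)
        init[mask] = generator(z).detach()
    samples = langevin_sample(ebm, init, n_steps)
    bank[idx] = samples  # write the updated chain states back into the bank
    return samples
```

Longer mid-run or long-run regimes (adversarial defense, density estimation) would reuse the same loop with a larger n_steps at training or test time.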
[ 5844, 275, 253, 806, 1659, 604, 352, 1057, 417, 1347, 973, 327, 2460, 5978, 20425, 13418, 2429, 281, 643, 3210, 305, 507, 2685, 3210, 12393, 3210, 50275, 1542, 1650, 247, 3215, 11273, 3802, 1247, 310, 908, 281, 294, 7589, 1261, 366, 38391, 983, 2593, 577, 2568, 253, 1566, 3139, 41731, 13015, 38391, 983, 275, 2426, 273, 269, 301, 923, 2829, 337, 50276, 338, 253, 1332, 3916, 4038, 13418, 840, 359, 943, 1902, 281, 923, 1543, 643, 685, 269, 301, 1580, 326, 310, 2460, 5978, 1014, 604, 10883, 1159, 310, 417, 1896, 281, 11897, 352, 476, 1335, 1361, 281, 923, 604, 4038, 13418, 452, 643, 4648, 824, 347, 562, 1171, 35360, 5481, 50275, 5996, 30080, 22559, 2380, 891, 11435, 253, 4477, 3192, 253, 673, 281, 2319, 285, 2953, 619, 7350, 19235, 891, 588, 1978, 619, 7363, 347, 310, 50275, 22202, 281, 37317, 268, 554, 79, 891, 717, 417, 7094, 13762, 670, 970, 269, 301, 275, 253, 4038, 14053, 2593, 2119, 4038, 14053, 1057, 417, 2430, 10883, 3470, 285, 1097, 269, 301, 285, 12177, 403, 33657, 347, 6341, 273, 2460, 3290, 533, 253, 4154, 908, 275, 436, 2929, 476, 671, 320, 908, 275, 305, 507, 352, 310, 671, 14053, 690, 2238, 273, 4038, 285, 368, 476, 7472, 269, 301, 342, 352, 38391, 983, 403, 625, 43541, 685, 305, 507, 275, 253, 3282, 326, 597, 671, 1918, 440, 6320, 1025, 16689, 533, 253, 2929, 1620, 908, 841, 16689, 323, 667, 14282, 8892, 751, 562, 1171, 35360, 5481, 253, 30080, 22559, 671, 25957, 326, 20451, 1957, 247, 1534, 16018, 432, 17068, 7274, 323, 4715, 285, 16344, 4038, 3210, 891, 13414, 923, 849, 16344, 269, 301, 310, 247, 1534, 16018, 954, 38391, 1814, 833, 789, 846, 6247, 2168, 1304, 269, 2352, 50275, 20137, 789, 556, 2011, 326, 12393, 3210, 476, 7171, 305, 507, 327, 4440, 257, 292, 1014, 342, 2169, 30285, 5987, 7280, 681, 5758, 66, 36753, 1356, 13437, 2035, 436, 3797, 49795, 285, 17697, 3210, 50276, 783, 22335, 4460, 2372, 275, 436, 2929, 3133, 281, 320, 970, 3215, 11273, 21025, 323, 294, 7589, 1261, 318, 285, 604, 627, 403, 3210, 326, 2168, 27125, 299, 5844, 275, 2460, 5978, 2139, 651, 359, 6194, 285, 897, 1529, 299, 5844, 3185, 253, 30080, 22559, 25957, 326, 18075, 824, 347, 39762, 11193, 4917, 1199, 7197, 3268, 267, 34754, 685, 278, 3591, 68, 534, 1537, 320, 2032, 275, 8063, 533, 476, 320, 3240, 2080, 432, 752, 6569, 275, 3946, 1580, 278, 3591, 68, 476, 452, 3468, 12480, 3885, 969, 12393, 3210, 403, 11797, 407, 39762, 17032, 285, 1646, 281, 452, 3240, 1175, 269, 2352, 50276, 455, 275, 512, 891, 1089, 326, 253, 2929, 285, 30080, 22559, 1056, 690, 3916, 275, 3718, 273, 38391, 983, 285, 253, 2929, 3139, 326, 891, 2550, 4751, 5194, 342, 50276, 783, 2929, 35742, 2710, 1027, 3237, 533, 2789, 4942, 1652, 7680, 281, 1016, 273, 731, 253, 3916, 689, 16774, 16226, 403, 41470, 253, 38391, 1814, 833, 3082, 513, 417, 452, 271, 5750, 689, 4868, 3169, 50276, 13437, 2035, 1006, 800, 3210, 275, 2426, 273, 5978, 285, 12177, 7103, 534, 671, 32547, 26230, 10883, 3470, 5474, 33032, 2520, 2929, 25339, 495, 4893, 323, 38391, 983, 2460, 5978, 48960, 31640, 285, 4038, 14053, 253, 4477, 2319, 849, 253, 2629, 3082, 323, 3733, 38391, 983, 513, 417, 18212, 2953, 253, 6158, 767, 273, 841, 4893, 323, 436, 1921, 597, 12661, 247, 747, 1332, 534, 24772, 5697, 432, 374, 4633, 299, 5844, 3733, 3082, 820, 412, 47301, 285, 268, 2428, 281, 33623, 690, 3374, 342, 1097, 3082, 253, 4477, 1246, 1543, 327, 253, 495, 8892, 5393, 285, 616, 1332, 4620, 13857, 2429, 281, 643, 299, 5844, 11640, 20544, 50276, 2520, 2929, 12453, 247, 1781, 2523, 275, 253, 299, 5844, 2317, 1655, 3082, 323, 3733, 841, 3210, 403, 2834, 281, 19928, 17631, 285, 3468, 
2429, 281, 643, 1006, 800, 14053, 7274, 253, 4081, 4993, 16248, 5697, 432, 820, 412, 47301, 285, 268, 2428, 778, 2085, 247, 1682, 1171, 15617, 10186, 84, 2746, 281, 3733, 38391, 983, 1223, 6240, 642, 18332, 327, 1755, 273, 2057, 273, 841, 3733, 3082, 50275, 44295, 436, 2929, 9113, 10316, 4116, 281, 1142, 1524, 3374, 275, 253, 299, 5844, 2317, 824, 347, 253, 3410, 3290, 50276, 20425, 1566, 3290, 5454, 2727, 347, 973, 253, 2929, 12453, 48960, 31640, 534, 4620, 281, 320, 247, 6422, 285, 12532, 273, 38391, 983, 1223, 891, 717, 417, 271, 6485, 275, 48960, 31640, 285, 2550, 1056, 2221, 9072, 3916, 670, 253, 36594, 273, 841, 4679, 352, 4620, 281, 479, 326, 253, 31640, 1543, 1918, 247, 4891, 7756, 689, 721, 285, 1142, 643, 5609, 15846, 5486, 323, 436, 4836, 50275, 20881, 1255, 265, 50276, 6050, 253, 4477, 2085, 690, 1029, 5251, 30328, 323, 616, 4081, 14586, 281, 820, 412, 3024, 1033, 2428, 597, 513, 417, 2085, 667, 10527, 22861, 323, 616, 4081, 2746, 452, 368, 2783, 643, 4088, 281, 3157, 253, 9991, 273, 253, 820, 412, 3024, 3530, 812, 436, 320, 1955, 281, 253, 34754, 326, 403, 1160, 342, 253, 2629, 820, 412, 3024, 4715, 5933, 534, 253, 4477, 3748, 368, 812, 3963, 907, 15579, 347, 275, 608, 577, 891, 1928, 253, 2929, 651, 320, 1160, 1199, 10046, 604, 625, 1557, 369, 4845, 327, 253, 4278, 285, 625, 10527, 22861, 369, 1677, 50276, 783, 5661, 4278, 275, 253, 2929, 403, 3240, 43721, 253, 4477, 513, 417, 3748, 2712, 670, 253, 5556, 14460, 908, 253, 4715, 4142, 253, 14604, 9552, 390, 3733, 3733, 673, 347, 824, 253, 1543, 275, 436, 2929, 403, 417, 41374, 891, 717, 6600, 253, 4477, 452, 2908, 2127, 533, 841, 4278, 943, 320, 12482, 432, 253, 2929, 387, 1878, 275, 253, 30762, 2007, 432, 271, 5839, 273, 253, 2127, 627, 3176, 281, 320, 247, 1180, 273, 3081, 5661, 4373, 22041, 824, 347, 11786, 502, 8201, 285, 2341, 2660, 272, 534, 403, 417, 2378, 5393, 275, 253, 2929, 841, 14586, 281, 253, 3733, 8103, 476, 320, 1077, 5506, 323, 253, 2323, 33699, 273, 38391, 983, 672, 3733, 275, 436, 1039, 285, 627, 556, 644, 247, 1175, 2968, 273, 789, 16585, 352, 50276, 2520, 789, 310, 13756, 281, 3157, 7882, 285, 2323, 273, 299, 5844, 3733, 1142, 273, 841, 24866, 390, 288, 7305, 7293, 327, 634, 1127, 273, 1859, 403, 627, 281, 18045, 323, 1142, 273, 253, 3374, 326, 436, 2929, 29328, 281, 2953, 824, 347, 1029, 11041, 273, 4016, 3410, 27935, 432, 3733, 327, 9841, 501, 312, 6216, 6667, 594, 352, 943, 320, 1160, 2590, 534, 604, 667, 273, 841, 24866, 403, 642, 3356, 3058, 50275, 74, 452, 690, 3374, 342, 253, 7103, 908, 275, 253, 1048, 6321, 10491, 323, 4038, 13418, 2593, 253, 4477, 1750, 2393, 275, 253, 2929, 326, 3410, 3290, 476, 320, 247, 24363, 15301, 273, 1566, 3290, 835, 3290, 310, 6760, 407, 4869, 12177, 534, 891, 4336, 5194, 342, 891, 369, 840, 19271, 281, 1089, 253, 760, 11745, 906, 275, 436, 2250, 369, 269, 301, 534, 310, 247, 18276, 2557, 273, 3410, 3290, 604, 253, 4477, 971, 281, 9059, 326, 616, 3733, 5199, 33772, 247, 1805, 4038, 1566, 840, 627, 403, 1142, 5795, 27163, 534, 812, 320, 908, 50276, 395, 452, 644, 908, 275, 3332, 299, 5844, 789, 368, 812, 897, 247, 261, 22525, 281, 6642, 5170, 12973, 35800, 327, 12177, 347, 275, 337, 374, 390, 368, 812, 897, 634, 3733, 5199, 281, 6194, 247, 10649, 494, 12177, 1566, 285, 840, 7472, 970, 253, 3210, 1929, 12177, 347, 275, 495, 577, 50275, 37585, 1745, 80, 275, 2829, 374, 50275, 18, 3443, 340, 300, 328, 285, 25477, 263, 278, 636, 1506, 15424, 5978, 285, 26647, 275, 2341, 3169, 3210, 549, 32693, 638, 3845, 549, 32693, 746, 15960, 25, 29941, 6247, 374, 305, 8500, 391, 4113, 33980, 
1162, 355, 4715, 2341, 3169, 3210, 407, 12393, 7355, 12177, 549, 32693, 638, 3845, 549, 32693, 1252, 17391, 9312, 9169, 495, 4498, 30966, 1162, 355, 25530, 4868, 11038, 247, 44755, 2746, 281, 4038, 285, 4868, 13418, 11649, 275, 13345, 9260, 268, 1686, 83, 9169, 577, 650, 506, 680, 12408, 588, 1162, 355, 642, 278, 3591, 68, 323, 479, 717, 430, 1025, 10491, 323, 3809, 285, 6474, 3733, 273, 2341, 3169, 3210, 549, 32693, 638, 3845, 549, 32693, 1252, 361, 2945, 1229, 9169, 608, 1073, 1205, 3067, 74, 270, 1162, 355, 15588, 1006, 800, 48960, 6928, 549, 32693, 638, 3845, 549, 32693, 746, 44197, 19044, 6247, 721, 13599, 278, 2682, 480, 251, 10511, 4784, 16521, 285, 4498, 348, 328, 1182, 11917, 19191, 3988, 48960, 5684, 970, 1048, 6321, 8062, 273, 2341, 3169, 3210, 549, 32693, 638, 3845, 549, 32693, 9204, 13743, 1099, 9169, 50276, 5996, 2488, 2380, 50276, 74, 11435, 253, 4477, 19392, 281, 619, 8680, 285, 5717, 731, 323, 253, 2544, 597, 452, 1160, 281, 253, 2929, 19235, 891, 513, 417, 1089, 616, 7125, 21414, 5001, 4038, 13418, 285, 253, 897, 273, 269, 301, 281, 7472, 253, 1332, 347, 973, 604, 253, 4477, 497, 2104, 281, 7568, 326, 616, 1332, 8077, 7790, 299, 5844, 3733, 1223, 9159, 2074, 3290, 1543, 436, 651, 320, 18511, 533, 891, 513, 417, 1928, 436, 369, 2218, 275, 253, 789, 253, 4081, 1332, 310, 625, 9542, 685, 820, 412, 47301, 285, 268, 2428, 285, 3400, 3081, 4373, 22041, 281, 19928, 604, 253, 4477, 497, 2104, 281, 1056, 247, 18511, 4154, 326, 616, 1332, 4483, 441, 281, 3037, 247, 1805, 4038, 1566, 840, 891, 651, 1329, 14924, 273, 436, 789, 533, 891, 513, 417, 2868, 436, 369, 2218, 323, 436, 1921, 891, 588, 1978, 619, 4868, 253, 1072, 627, 403, 690, 4722, 5697, 3559, 275, 436, 789, 533, 891, 513, 417, 1928, 253, 1543, 3559, 7568, 326, 597, 403, 247, 4209, 7680, 281, 253, 1673, 323, 14924, 50275, 2520, 2929, 10262, 247, 747, 1332, 323, 3733, 38391, 983, 534, 24772, 3386, 273, 767, 4633, 3733, 7274, 820, 412, 47301, 285, 268, 2428, 1223, 253, 4081, 1332, 2789, 3282, 285, 4620, 281, 789, 627, 403, 10665, 3374, 342, 253, 3082, 7103, 5661, 4278, 285, 10527, 22861, 3021, 275, 697, 1655, 830, 891, 513, 417, 21424, 323, 697, 14924, 50276, 7152, 33032, 2520, 789, 10262, 1027, 278, 3591, 68, 31850, 5609, 323, 3733, 299, 5844, 342, 845, 250, 624, 16095, 323, 253, 6378, 273, 2460, 5978, 48960, 5684, 285, 4038, 13418, 2975, 625, 5742, 323, 2159, 6321, 2460, 5978, 253, 2488, 4081, 247, 9769, 15663, 27293, 31850, 326, 29966, 253, 3480, 273, 9991, 2523, 273, 299, 5844, 4715, 342, 14156, 31850, 323, 4260, 6321, 48960, 5684, 253, 2488, 4081, 247, 3215, 11273, 14156, 294, 7589, 1261, 318, 281, 4311, 598, 299, 5844, 5684, 323, 1048, 6321, 4038, 13418, 253, 2488, 4081, 294, 7589, 1261, 318, 3082, 285, 37820, 10480, 281, 3451, 689, 22354, 2742, 253, 2929, 310, 973, 3542, 285, 3477, 281, 956, 253, 16774, 3045, 4453, 1175, 2429, 281, 253, 8245, 3210, 3738, 253, 28959, 4419, 2007, 5839, 253, 14855, 273, 253, 2929, 310, 326, 253, 7680, 3133, 281, 320, 32809, 347, 954, 273, 253, 5609, 908, 275, 436, 2929, 452, 644, 4081, 9366, 2010, 2708, 403, 690, 273, 619, 7350, 50276, 18, 323, 2460, 5978, 2139, 417, 897, 247, 3215, 11273, 14156, 285, 604, 253, 14156, 2168, 2085, 1029, 3290, 3530, 24088, 3802, 1247, 752, 310, 253, 4096, 273, 970, 299, 5844, 846, 352, 858, 253, 8245, 38391, 983, 897, 14156, 31850, 671, 891, 858, 417, 923, 247, 5301, 281, 1269, 466, 1162, 355, 4765, 50276, 19, 849, 1057, 253, 3215, 1243, 14156, 31850, 2429, 342, 941, 3530, 849, 1774, 604, 35375, 310, 35375, 3732, 281, 643, 8245, 3082, 3480, 28913, 2175, 
1060, 50276, 20, 835, 310, 253, 4038, 13418, 906, 323, 1048, 6321, 299, 5844, 50276, 1189, 455, 253, 2929, 2085, 690, 4722, 16774, 4342, 323, 38391, 983, 342, 1027, 278, 3591, 68, 16095, 2299, 1655, 3368, 1543, 285, 3480, 273, 38135, 1056, 352, 247, 2372, 762, 253, 2534, 273, 17857, 32888, 2490, 187, 4118, 18435, 27, 2520, 2929, 4081, 247, 5700, 281, 6194, 38391, 983, 2556, 281, 253, 2978, 273, 278, 3591, 68, 24102, 2424, 253, 2929, 10949, 1264, 7533, 342, 253, 1027, 2978, 273, 278, 3591, 68, 2460, 9066, 48960, 5684, 285, 4038, 13418, 253, 30628, 3839, 1089, 326, 627, 403, 4722, 5697, 285, 12532, 1543, 275, 253, 2929, 533, 253, 2929, 310, 417, 4704, 281, 15452, 387, 697, 1655, 3924, 253, 4154, 5001, 4038, 13418, 285, 269, 301, 7103, 310, 417, 21414, 253, 4081, 1332, 310, 671, 625, 9542, 685, 253, 8245, 3082, 820, 412, 47301, 285, 268, 2428, 285, 359, 651, 878, 247, 10046, 4154, 323, 253, 2879, 10454 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review: This paper shows, through a set of experiments, that the common belief that a large neural network which is trained, pruned, and then fine-tuned performs better than a network of the same size as the pruned one but trained from scratch is actually false; that is, a pruned network does not perform better than a network with the same dimensions trained from scratch. The authors also argue that what matters for good performance is knowing how many weights/filters are needed at each layer, while the actual values of the weights do not matter. What happens during standard large-network training can then be seen as an architecture search in which the algorithm learns the right amount of weights for each layer.

Pros: If these results are generally true, then most pruning techniques are not really needed, which is an important result. If these results hold, there is no need to train larger models and prune them; the best results can be obtained by training the right architecture from scratch. The intuition that neural network pruning is actually performing architecture search is quite interesting.

Cons: It is still difficult to believe that most of the previous work and previous experiments, as in Zhu & Gupta (2018), are faulty. Another paper with opposing results is [1]; there the authors run an explicit control experiment in which they train a pruned network with random initialization and obtain worse performance than when the network is pruned and retrained with the correct initialization. Soft pruning techniques such as [2] obtain even better results than the original network, and these approaches are not considered in the analysis: for instance, in their Table 1, ResNet-56 pruned by 30% obtains a gain of 0.19, while your ResNet-50 pruned by 30% obtains a loss of 4.56 in Table 2, which is a significant difference in performance.

Global evaluation: In general the paper is well written and gives good insights about pruning techniques. However, considering the vast literature that contradicts this paper's results, it is not easy to understand which results to believe. It would be useful to see whether the authors can also obtain good results, without pruning, on the control experiment in [1]. Finally, it seems that the proposed method is worse than soft pruning; with soft pruning we do not gain in training speed, but if the main objective is performance, that is a very relevant result and it makes the claims of the paper weaker.

Additional comments: Top of page 4, "in practice we found that increasing the training epochs within a reasonable range is rarely harmful": if you use early stopping, results should not be affected by the number of training epochs, provided training runs until convergence.

[1] The Lottery Ticket Hypothesis: Finding Small, Trainable Neural Networks. Jonathan Frankle, Michael Carbin. arXiv, 2018.
[2] Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks. Yang He, Guoliang Kang, Xuanyi Dong, Yanwei Fu, Yi Yang. arXiv, 2018.

docsep

This paper proposes to investigate recent popular approaches to pruning networks, which have roots in work by LeCun (1990) and are mostly rooted in a recent series of papers by Song Han (2015-2016). The methods proposed in these papers consist of the following pipeline: (i) train a neural network, (ii) prune the weights, typically by trimming those connections whose weights have the lowest magnitude, and (iii) fine-tune the resulting sparsely connected neural network.
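To make the three-stage pipeline described above concrete, here is a minimal PyTorch-style sketch of magnitude pruning followed by fine-tuning. It is a sketch under assumptions: the sparsity level, optimizer settings, and mask handling are illustrative choices, not the procedure used in the paper under review or in the cited pruning papers.

```python
# Minimal sketch of the train -> magnitude-prune -> fine-tune pipeline.
# The architecture, sparsity level, and optimizer settings are illustrative
# assumptions, not the exact setup discussed in the reviews.
import torch
import torch.nn as nn

def magnitude_prune(model: nn.Module, sparsity: float = 0.5):
    """Zero out the smallest-magnitude weights in every Conv/Linear layer."""
    masks = {}
    for name, module in model.named_modules():
        if isinstance(module, (nn.Conv2d, nn.Linear)):
            w = module.weight.data
            k = int(w.numel() * sparsity)  # number of weights to drop
            if k > 0:
                threshold = w.abs().flatten().kthvalue(k).values
            else:
                threshold = w.abs().min() - 1.0  # keep everything
            mask = (w.abs() > threshold).float()
            module.weight.data.mul_(mask)   # prune in place
            masks[name] = mask              # keep mask to freeze pruned weights
    return masks

def finetune(model, masks, loader, epochs=40, lr=1e-3):
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
            # re-apply masks so pruned connections stay at zero
            for name, module in model.named_modules():
                if name in masks:
                    module.weight.data.mul_(masks[name])
```

Keeping the binary masks around is what turns step (iii) into fine-tuning of a fixed sparse architecture rather than ordinary dense training.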
The authors of the present work assert that, traditionally, each of the three stages is considered indispensable. They go on to investigate the contribution of each step to the overall pipeline. Among their findings, they report that fine-tuning appears no better than training the resulting pruned network from scratch. The assertion, then, is that the important aspect of pruning is not that it identifies the important weights, but rather that it identifies a useful sparse architecture.

One problem here is that the authors may overstate the extent to which previous papers emphasize the fine-tuning, and understate the extent to which previous papers emphasize the learning of the architecture. Rereading Han (2015), it seems clear enough that the key point is learning the connections (it is right there in the title) and that the "important weights" are a means to achieve this end. Moreover, the authors may miss the actual point of fine-tuning: its chief benefit is that it is faster than training from scratch at each round of retraining, so even if it only matches the performance of training from scratch, that is still a key benefit. In general, when making claims about other people's beliefs, the authors need to provide citations; references are not just about credit attribution but also about providing evidence, and here that evidence is missing. I would like to see sweeping statements like "this is usually reported to be superior to directly training a smaller network from scratch" supported by precise references, perhaps even a quote, to spare the reader some time.

To this reader, the most interesting finding in the paper is by far the one that is surprisingly understated in the abstract and introduction and buried at the end of the paper. Here the authors investigate which properties of the resulting sparse architectures make them useful. They find that by looking at convolutional kernels from pruned architectures they can obtain, for each connection, a probability that the connection is kept; using these probabilities they can create new sparse architectures that match the sparsity pattern of the pruned architectures, a technique they call guided sparsification. The method yields benefits similar to pruning. Note that while obtaining the sparsity patterns does require running a pruning algorithm in the first place, the learned sparsity patterns generalize well across architectures and datasets. This result is interesting, useful, and to my knowledge novel. I think the authors should go deeper here, investigating the idea on yet more datasets and architectures; ImageNet would be nice. I also think that this result should be given greater emphasis and raised to the level of a major focal point of the paper. With convincing results and some hard work to reshape the narrative to support this more important finding, I will consider revising my score.
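As a rough illustration of how such per-connection keep probabilities could be extracted and reused, the sketch below averages the binary keep-masks of pruned convolutional kernels and samples new masks from them. The averaging scheme and the Bernoulli sampling are assumptions made for illustration only; the paper's actual guided-sparsification recipe may differ.

```python
# Sketch: estimate a keep-probability pattern from pruned conv kernels, then
# sample a new sparse architecture from it. The averaging below is an
# illustrative guess at "guided sparsification", not the authors' exact recipe.
import torch
import torch.nn as nn

def keep_probabilities(pruned_model: nn.Module):
    """For each conv layer, average the binary keep-mask over all kernels,
    giving one keep probability per kernel position (e.g. a 3x3 grid)."""
    patterns = {}
    for name, m in pruned_model.named_modules():
        if isinstance(m, nn.Conv2d):
            mask = (m.weight.data != 0).float()     # shape (out, in, kh, kw)
            patterns[name] = mask.mean(dim=(0, 1))  # shape (kh, kw)
    return patterns

def sample_guided_masks(fresh_model: nn.Module, patterns):
    """Sample new binary masks for an untrained model of the same shape."""
    masks = {}
    for name, m in fresh_model.named_modules():
        if isinstance(m, nn.Conv2d) and name in patterns:
            p = patterns[name].expand_as(m.weight.data)
            masks[name] = torch.bernoulli(p)
    return masks
```

The point the review highlights is that the patterns, once estimated, can be applied to a fresh model without re-running pruning on it.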
docsep

This paper re-investigates several recent works on network pruning and finds that the common belief about the necessity of training a large network before pruning may not hold. The authors find that training the pruned model from scratch can achieve similar, if not better, performance given enough training time. Based on these observations, the authors conclude that training a larger model followed by pruning is not necessary for obtaining an efficient model with similar performance; in other words, the pruned architecture is more important than the weights inherited from the large model. It reminds researchers to build stronger baselines before presenting complex pruning methods.

The paper is well organized and written. It re-evaluates the recent progress made on this topic: instead of comparing approaches by simply reusing the numbers from previous papers, the authors perform extensive experiments to verify whether training the pruned network from scratch would work. The results are very interesting, and they suggest that researchers tune the baselines hard and stick to simple approaches. However, here are some places where I have concerns.

1. The two "common beliefs" actually state one thing, namely that the weights of a pretrained larger model can potentially help optimization of a smaller model.

2. I do not quite agree that training is the first step of a pruning pipeline, as illustrated in Figure 1. Actually, the motivation, or the common assumption, for pruning is that trained models already exist: training is already finished with good performance. If a trained model does not even exist, then one can certainly train various thinner/smaller models from scratch as before; this is still a trial-and-error process.

3. The value of pruning: the goal of pruning is to explore a thinner or shallower version of a model with similar accuracy while avoiding an exhaustive architecture search with heavy training. Thus the first value of pruning is to explore efficient architectures while avoiding heavy training; it should therefore be fast and efficient, ideally with no retraining or only a little fine-tuning. When the pruning method is too complex to implement, or requires much more time than training from scratch, it can be overkill and adds little value, especially when the performance is not much better. It would therefore be more informative if the authors reported the time complexities for pruning/fine-tuning.

4. The second value of pruning lies in understanding the redundancy of the model and providing insights for more efficient architecture designs.

5. Compared to random initialization, pruning simply provides an initialization point inherited from the larger network. The essential question the authors ask is whether a subset of pretrained weights can outperform random initialization; this seems to be a common belief in transfer learning, knowledge distillation, and studies on initialization. The authors conclude that the accuracy of an architecture is determined by the architecture itself and not by the initialization. If this is true, training from scratch should have a similar, but not better, result than fine-tuning a pruned model, since the inherited weights can also be viewed as just another initialization; both methods should reach equivalently good solutions if they are trained for enough epochs. Can this be verified with experiments?

6. The experiments might not be enough to reject the common belief. They only show that the pruned architectures can still be easily trained and encounter no difficulties during optimization. One conjecture is that the pruned models in the previous work still have enough capacity to keep good accuracy. What if the models are significantly pruned, say with more than 70% of channels pruned: does training from scratch still work well? It would add much value if the authors could identify when training from scratch fails to match the performance obtained by pruning and fine-tuning.

7. In Section 4.1, "scratch-trained models achieve at least the same level of accuracy as fine-tuned models": first, the ResNet-34 pruned A/B models used for this comparison do not have a significant FLOPs reduction (10% and 24%), and fine-tuning still has an advantage, as it only takes a fraction of the training time compared to Scratch-E. Second, it is interesting that fine-tuning generally has smaller variance than Scratch-E (except for VGG-19); would this imply that fine-tuning a pruned model produces a more stable result? It would be more complete if there were a variance analysis for the ImageNet results as well.

8. What training/fine-tuning hyperparameters are used in Section 4.1? Note that in the experiments of Li et al. (2017), Scratch-E takes 164 epochs to train from scratch while fine-tuning takes only 40 epochs. As suggested above, if we fine-tune with more epochs, would it achieve equivalent performance? Also, what hyperparameters are used for Scratch-E? Note that the original paper uses batch size 128; if the authors adopt a smaller batch size for Scratch-E, then it results in more iterations and could certainly yield better performance, according to the recent belief that small batch sizes generalize better.

9. The conclusion of Section 5 is not quite clear or novel. Using a uniform pruning ratio is expected to perform worse than automatic pruning methods, as it does not consider the difference in importance of each layer; and this comes back to my points 3 and 4 about the value of pruning, namely that it lies in the analysis of the redundancy of the network. There are a number of works on analyzing the importance of different layers and filters, so I think the hypothesis that "the value of automatic pruning methods actually lies in the resulting architecture rather than the inherited weights" is kind of straightforward. Also, why not use FLOPs as the x-axis in Figure 3?

Minor: it might be more accurate to call it L1-norm based filter pruning (Li et al., 2017), as "channels" usually refers to feature maps, which are byproducts of the model but not the model itself.

I will revise my score if the authors can address the above concerns.

Review after rebuttal:

1 & 2. It would be great if the authors could make it clear in the introduction, rather than only mentioning it in the conclusion, that training is not always the first step, and could clarify the value of pruning; saving training time is still an important factor when training from scratch is expensive.

5. Fine-tuning with enough epochs: I understand that the authors are mainly questioning whether training from scratch is necessarily worse than pruning plus fine-tuning. The authors do find that training from scratch is better when the number of epochs is large enough, but we see that fine-tuning ResNet-56 A/B with 20 epochs does outperform, or is equivalent to, scratch training for the first 160 epochs, which validates that fine-tuning converges faster. However, training for 320 epochs (16x more than the 20 epochs of fine-tuning, and 2x normal training from scratch) is not quite consistent with the setting of Scratch-B, as ResNet-56-B only reduces FLOPs by 27%. The other part of the question is still unclear: the authors claimed that the accuracy of an architecture is determined by the architecture itself and not the initialization; then both fine-tuning and scratch training should reach equivalent solutions if they are trained well enough, regardless of the initialization or pruning method. The learning-rate schedule for scratch training is already well known (learning-rate drops boost the accuracy), but the learning-rate schedule for fine-tuning, especially for significantly pruned models as in Reply 6, is not well explored. I wonder whether carefully tuned learning rates/hyperparameters for fine-tuning could reach the same or better performance as scratch training. Questions: are both methods using the same learning-rate schedule between epoch 160 and epoch 320? The ResNet-56 A/B results in Reply 8 do not match the performance reported in Reply 5: e.g., Reply 5 shows 92.67 +/- 0.09 for ResNet-56-B with 40 epochs of fine-tuning, but it turns out to be 92.68 +/- 0.19 in Reply 8. It would be great if the authors could add convergence curves for fine-tuning and scratch training for easier comparison.

6. The failure case for sparse pruning on ImageNet is interesting, and it would be great to have the ImageNet result reported and discussed. The authors find that when the pruning ratio is large enough, training from scratch is better by an even larger margin than fine-tuning. This could be due to the following reasons: (1) when the pruning ratio is large, the pruned model with preserved weights is significantly different from the original model, and fine-tuning with a small learning rate and a limited number of epochs is not enough to recover the accuracy; as mentioned earlier, tuning the fine-tuning hyperparameters based on the pruning ratio might improve its performance; (2) although the pruning ratio is large, the model used in this experiment may still have enough capacity to reach good performance; how about pruning ResNet-56 with significant pruning ratios? Finally, based on the above observations, it seems to me that the preserved weights are more essential for fast fine-tuning but less useful at significant pruning ratios.

Update: the authors addressed most of my concerns; some questions remain in my "review after rebuttal" comment, specifically that fine-tuning a pruned network may still get good performance if the hyperparameters are carefully tuned based on the pruning ratios, or, in other words, that the preserved weights are more essential for fast fine-tuning but less useful at significant pruning ratios. The authors may need to draw their conclusions from the observations more carefully. I would hope the authors can address these concerns in a future version. However, I think the paper is overall well written, and the existing content is inspiring enough for readers to further explore the trainability of the pruned network; therefore I raised my score to 7.

### Summary:
The paper presents a lot of empirical evidence that fine-tuning pruned networks is inferior to training them from scratch. These results seem unsurprising in retrospect, but hindsight is 20/20. The reviewers raised a wide range of issues, some of which were addressed and some of which were not. I recommend that the authors make sure that any claims they draw from their experiments are sufficiently circumscribed; e.g., the lottery-ticket experiments done by an anonymous commenter in response to this paper show that random initialization does worse than restarting with the initial weights, except in ResNet (though this seems possibly due to the learning rate). There is something different in their setting, and so your claims should be properly circumscribed. I do not think the "standard versus nonstandard" terminology is appropriate until the actual boundary between these two behaviors is identified; I would recommend the authors make guarded claims here.
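For readers unfamiliar with the control experiment the summary alludes to, the following schematic contrasts rewinding the surviving weights to their original initialization (lottery-ticket style) with re-randomizing them before retraining. Here `make_model`, `train`, and `prune_masks` are hypothetical placeholders, not functions from any cited codebase, and the mask would normally be re-applied throughout retraining.

```python
# Schematic of the rewind-vs-reinitialize control experiment referenced above.
# make_model, train, and prune_masks are placeholders (assumptions), and mask
# re-application during retraining is omitted for brevity.
import copy
import torch.nn as nn

def lottery_ticket_control(make_model, train, prune_masks):
    model = make_model()
    init_state = copy.deepcopy(model.state_dict())  # snapshot before training

    train(model)                    # full training run
    masks = prune_masks(model)      # e.g. magnitude pruning, keyed by module name

    # (a) rewind: same sparse structure, original initial weights
    rewound = make_model()
    rewound.load_state_dict(init_state)

    # (b) reinit: same sparse structure, fresh random weights
    reinit = make_model()

    for m in (rewound, reinit):
        for name, module in m.named_modules():
            if name in masks and isinstance(module, (nn.Conv2d, nn.Linear)):
                module.weight.data.mul_(masks[name])  # impose the sparse structure once
        train(m)  # retrain; a full experiment would keep enforcing the mask each step
    return rewound, reinit
```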
Below is a review of a research paper from a conference or journal. Please write a summary of the review.

### Review:

this paper presents a new selfsupervised learning framework to enhance language models based on graph information

strengths: the proposed method is simple and reasonable the experimental studies are extensive

weaknesses:

1 the novelty is limited in my view masked language modeling aims to predict the masked tokens given the context and in this paper neighborhood prediction aims to predict the relation given the context relation-prediction based objectives have been widely applied in knowledge- and entity-oriented pretraining such as k-adapter wang et al and erica qin et al in addition the impact of the proposed method could be rather minor given their experimental results for example in table 1 on the ogbn-arxiv dataset based on giant-xrt graphsage revgat and revgat+selfkd gain accuracies of 74.59 75.96 and 76.12 respectively and based on tfidf (no pifa) they gain accuracies of 74.09 75.56 and 75.85 with such small gaps it would be important to know whether the difference is actually statistically significant

2 some details in figure 1 are not clear eg the denotations of a and y and the full term for xmc to make figure 1 selfcontained the caption of the figure should provide more necessary information

3 the ideas are verified on three node classification datasets while some details are missing the split ratio in table 1 is confusing i cannot figure out what the ratios for train/test/development are how many classes are there for each dataset it would be better to show some real instances

4 what is the dataset used for pretraining giant

the novelty is limited and the impact of the proposed method could be rather minor some necessary details should be provided

docsep

the paper proposed a selfsupervised learning framework for learning node features by exploring the correlation between the node features and the graph structure which leverages the graph information based on neighborhood prediction to be specific the proposed giant approach is combined with the pretrained language model bert and incorporates the xmc formalism based on xr-transformer partial theoretical analysis is also presented experiments conducted on three large benchmark datasets show promising improvements

strengths: introducing the idea of neighborhood prediction to guide selfsupervised node feature learning is interesting and somewhat novel connecting neighborhood prediction with the xmc problem is novel extensive experiments are conducted on ogb and show new stateoftheart results

weaknesses: the reviewer has some concerns on the provided theoretical analysis based on the csbm deshpande et al 2018 it seems misleading and incomplete the theoretical analysis could be deduced from the analysis in baranwal et al 2021 with a few changes in baranwal et al 2021 the csbm is used to analyze the effect of the graph convolution operation on linear separability the established theoretical results show that if the means of the two gaussian mixture components are not larger than a threshold the results after graph convolution are not guaranteed with high probability to improve linear separability however the statements in theorem 4.4 are relatively vague note that pifa is just one step of a graph convolution on the node features plus a normalization step what can we say about the performance of using the pifa embedding without the characteristics of the node features and the affinity of the graph convolution it is hard to draw a convincing conclusion
furthermore the requirement on p > q ie that the probability p of having a link between two nodes having the same label (yi = yj) should be larger than the probability q of having a wrong link between two nodes having different labels (yi ≠ yj): is it necessary or not and why

the idea is clear and the empirical evaluation is strong since the reviewer has some concerns on the provided theoretical analysis it would be safe to decide after reading the feedback from the authors

docsep

this paper develops a selfsupervised learning framework to extract node features with the aid of the graph connections between neighborhood prediction and the xmc problem are also established experiments on largescale data show the superiority of the proposed method

strengths: 1 the problem is well motivated 2 the proposed framework could be useful in general 3 both theoretical analysis and experiments are convincing

weaknesses: 1 the efficiency is not supported with experiments 2 there are some typos

this paper develops a selfsupervised learning framework to extract node features supervised by the graph it is an interesting problem theoretical analysis is also provided extensive experiments on largescale data validated the effectiveness of the proposed method there are some typos eg "these have to be predicted using the textual information in order to best match the a priori given graph topology" and "this is achieved this by using the stateoftheart xrtransformer zhang et al 2021a method for solving the xmc problem"

### Summary:
in this submission the authors presented a framework giant for selfsupervised learning to improve lm by leveraging graph information reviewers agree that the method is somewhat novel the partial theoretical analysis is interesting and the evaluations are strong we thank the authors for doing an excellent job in rebuttal which cleared essentially all the questions reviewers initially raised
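as an aside on the p versus q condition questioned in the reviews above, the edge-probability requirement can be written out compactly; the notation below is an assumed shorthand of mine, not taken from the paper or the reviews:

```latex
% homophilous (contextual) stochastic block model: with node labels y_u,
% an intra-class link is assumed more likely than an inter-class one.
\Pr\bigl[(u,v)\in E\bigr] =
\begin{cases}
  p, & y_u = y_v,\\
  q, & y_u \neq y_v,
\end{cases}
\qquad \text{with } p > q .
```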
Below is a review of a research paper from a conference or journal. Please write a summary of the review.

### Review:

this paper studies the online inference and learning problems for nonsymmetric determinantal point processes ndpps the authors use the online greedy algorithm for map inference and modify the learning objective to be suitable for the online setting experiments with realworld datasets show that the proposed online algorithms are comparable to or even better than stateoftheart offline algorithms

strengths: the paper studies a new problem of ndpps under the online setting that has not been covered before the authors properly apply the prior online algorithm for greedy submodular maximization to ndpps experimental results are convincing regarding the effectiveness of the proposed online inference and learning algorithms for ndpps

weaknesses: the writing quality needs to be improved the manuscript contains several typos and notational abuses which are noted in the minor comments below also the paper simply introduces the proposed algorithms without any justification or intuition which makes it hard to understand how the authors deal with problems under the online settings moreover some algorithms are already proposed in prior works but there is no reference for example algorithm 2 online lss was proposed in [1]

[1] bhaskara et al online map inference of determinantal point processes neurips 2019

it is not clear how the streaming setting in section 4 is different from the online setting in section 5 since both algorithm 1 and 4 take sequential inputs with random order or on the fly it seems that both can be used for both settings it would be great if more detailed descriptions of the streaming and online settings are provided

in section 5 the authors provide 2 different algorithms ie local search and 2-neighborhood local search comparing theorem 5 and 7 the latter has no gain in terms of runtime complexity however it shows better empirical performance what is the reason for it can the optimality of these algorithms be analyzed

the authors propose the approximate objective eq 4 for learning ndpps under the online setting however this is somewhat very different from the loglikelihood of a dpp because the objective contains log det l_s terms does learning this objective guarantee convergence to the groundtruth ndpp objective how does the approximated objective eq 4 relate to the offline version of the mle objective

minor comments: a comma is missing in the fourth line of the second paragraph on page 1 please edit stateoftheart in the second paragraph on page 2 please edit are minimize to are minimizing in the second last row on page 3 it would be good to place algorithm 4 in the main manuscript in theorem 5 and 7 please edit f(s) = det(v_s^top v_s + b_s^top c b_s) what is j_max in line 9 of algorithm 1

although this paper is the first to study these new problems of online ndpps it has a lack of algorithm novelty theoretical analyses as well as writing quality for addressing its methodology hence the paper should be improved for acceptance

docsep

this paper introduces the streaming and online map inference and learning problems for ndpps for streaming map inference an algorithm is proposed with total time linear in n and memory that is constant in n for online map inference several algorithms are proposed such that at any point in time a valid solution is maintained for online learning a single pass algorithm is proposed with memory that is constant in m experiments are conducted to show that these streaming and online algorithms achieve comparable performance to stateoftheart offline algorithms
the problems that this paper studies are very interesting several algorithms are proposed to solve these problems and the effectiveness of the algorithms is verified through experiments the technical sections are a bit hard to follow and a lot of details are omitted to save space for example in the outline of algorithm 2 the auxiliary set t is defined but its size estimate is not given it would be great to include some intuition in the main paper overall i think the problems that this paper studies are interesting and the proposed algorithms are effective

docsep

this paper introduces map inference and learning algorithms for nonsymmetric determinantal point processes ndpps in the streaming and online settings this paper provides the first analysis of ndpp-related algorithms within a streaming context its contributions include the algorithms themselves theoretical analyses algorithm guarantees space and time complexity and experimental evaluation of these algorithms across several standard dpp-evaluation datasets

strengths:

clarity: this paper is well exposed and the intuition behind the algorithms is clear the background exposition on ndpps is also very well written and accessible

novelty: as the authors mention this work provides the first analysis of map inference and learning for the streaming settings of ndpps

empirical evaluation: the authors show that the online 2-neighbor greedy map algorithm improves upon the offline algorithm which is a surprising and meaningful result

weaknesses: the streaming context is novel for ndpps however as the authors point out there exists previous work looking into streaming for standard dpps [1] comparing the proposed algorithms to [1] to verify that the lack of symmetry provides similar benefits in the streaming setting to those in the offline setting stands out as a missing comparison

the first streaming algorithm algorithm 1 is quite simplistic and is more interesting as a baseline rather than as a contribution in and of itself

the core idea behind the better performing map algorithms online lss and online 2-neighbor lies in the construction of a stash of discarded items this idea was introduced in [1] for the map inference of symmetric dpps but the explicit connection between these two works is not made in this paper if there are crucial differences between the stashes introduced in [1] and the stash used in this work this should be mentioned explicitly and in detail if the two ideas are similar this should be discussed prominently in this work similarly the results in theorem 5 are as far as i can tell almost identical to those of [1] theorem 3.1 again this should be discussed explicitly in this paper relatedly i think there is a confusion between the epsilon used in [1] and the alpha used here to describe online lss since i believe the authors use epsilon to describe online lss in the experimental section

questions comments: in [1] algorithm 1's dependency on epsilon is an issue since epsilon can be arbitrarily small does a similar issue arise for online lss and online 2-neighbor if so can this be addressed more generally being explicit about how the lack of symmetry in the dpp changes how the streaming setting must be approached would make for an interesting contribution of this work could 2-neighbor be extended to arbitrary sizes of subsets 3-neighbor etc understanding at which point the degree of interactions ceases to provide benefits that are worth the increase in memory/time constraints would be a valuable contribution to the dpp community
are pairwise interactions as in 2-neighbor sufficient to characterize most of the necessary dpp properties

the derivation of the gradient updates for the online algorithm can be removed from the main paper

can the authors provide any insight into the stunning performance of the online learning algorithm compared to the offline algorithm the online learning seems to converge almost an order of magnitude faster it would be interesting to see if it is possible to switch from the online to the offline algorithm after the initial jump in loglikelihood to achieve the best of both worlds fast convergence and low nll

am i correct in understanding that figure 1 reports the volume rather than the log volume if so the authors should consider log-warping the evaluation function f since improvements in the range of 10-20 are difficult to gauge

can you clarify the experimental conditions used for the offline learning algorithm batch size etc

[1] online map inference of determinantal point processes bhaskara et al 2020

this paper proposes the first analysis of map inference and learning of nonsymmetric dpps in a streaming setting the authors propose novel algorithms provide guarantees and complexity analyses and evaluate their algorithms empirically across a variety of benchmarks startlingly the authors show that their online algorithms are competitive with and often outperform their offline equivalents my main concern with this paper is novelty there is significant overlap between the map inference section of this work and previous work by bhaskara et al 2020 both in terms of the key ideas using a stash and in how the algorithms are analyzed if this overlap is only in appearance the authors should discuss in detail where their contributions depart from this previous work currently it is difficult to understand the extent of the novelty of this work

docsep

this paper proposes online and streaming algorithms for map inference and learning for nonsymmetric determinantal point processes ndpps for the streaming setting data points arrive in an arbitrary order and the algorithms are constrained to using a single pass over the data along with requiring sublinear memory consumption in the online setting there is the additional requirement of maintaining a valid solution at any time step the authors provide some theoretical guarantees for the proposed algorithms and perform experiments that demonstrate that their performance is comparable to or better than offline algorithms for these tasks

strengths: the proposed online and streaming algorithms for map inference and learning appear to be novel and in a number of cases empirically outperform stateoftheart offline ndpp algorithms for these tasks the proposed streaming map inference algorithm alg 1 has theoretical guarantees for map approximation quality and time and space complexity the proposed online map inference algorithms algs 2 and 4 have theoretical guarantees for time and space complexity the proposed online learning algorithm alg 3 has theoretical guarantees for time and space complexity the paper is reasonably well written and easy to follow

weaknesses: some of the claims made in sec 6.1 regarding the stateoftheart ndpp learning algorithm in gartrell et al 2021 appear to be incorrect sec 6.1 claims that the gartrell et al 2021 learning algorithm 1 must store all training data in memory 2 must make multiple passes over the data and 3 subsets are not processed sequentially as they arrive however the implementation of the learning algorithm in gartrell et al 2021 uses adam a variant of sgd which can run with a batch size of 1 or with minibatches and thus can be run in a streaming setting that does not require all data to be loaded into memory thus claims 1-3 seem to be incorrect
the normalization term z(v_st, b_st, c) used in the approximate objective eq 4 for the online learning algorithm is not equivalent to the standard normalization term for an ndpp z(v, b, c) nor is it clear that this approximate normalization term actually provides a reasonable approximation to the true ndpp normalizer the authors do not address this issue in the paper and thus the proposed online learning algorithm has no theoretical approximation guarantees when compared to standard ndpp learning algorithms therefore optimizing eq 4 appears to violate the requirements described in definition 8 since the true ndpp regularized loglikelihood is not being maximized this seems to be a critical issue

this paper has strong contributions in the area of streaming and online map inference algorithms however there are some notable issues with the contributions regarding the online learning algorithm including the comparison with prior work and the correctness of the approximate optimization objective eq 4 as described above thus unless the authors can address these issues in the rebuttal it is hard to recommend this paper for acceptance

### Summary:
this paper studies online map inference and learning for nonsymmetric determinantal point processes ndpps the main contribution is an online greedy algorithm surprisingly they show that their algorithm outperforms various offline algorithms on realworld datasets that said the main concern was the novelty with respect to the prior work of bhaskara et al who gave an online approximation algorithm for map inference in dpps to compare the two works 1 bhaskara et al give an algorithm for dpps and ndpps are more complex 2 bhaskara et al give provable guarantees on the approximation ratio but no such guarantees are known for ndpps 3 and finally some of the key ingredients in the online algorithm for ndpps like the stash were already in the work of bhaskara et al overall the reviewers felt that this submission would be improved with a clearer discussion of the contributions over prior work
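to make the map routine debated in these reviews a bit more concrete, below is a minimal sketch of a one-pass greedy map step for a determinantal kernel: an arriving item is kept only if it increases the log determinant of the selected submatrix. this is an illustration under my own assumptions, not the authors' algorithm; in particular it indexes a full kernel matrix in memory, which sidesteps the memory constraints the streaming setting is actually about.

```python
import numpy as np

def streaming_greedy_map(stream, L, min_gain=1e-6):
    """One-pass greedy MAP sketch: grow S while log det L[S, S] keeps increasing."""
    S = []        # indices selected so far
    best = 0.0    # log det of the empty selection
    for i in stream:
        cand = S + [i]
        sign, logdet = np.linalg.slogdet(L[np.ix_(cand, cand)])
        if sign > 0 and logdet > best + min_gain:  # keep items with a strict gain
            S, best = cand, logdet
    return S
```

a stash-based variant, as discussed in the reviews, would additionally keep some rejected items around and revisit them later, which is where the overlap with bhaskara et al 2020 comes up.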
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review: this paper proposed a matrix decompositionbased method to capture spatial long range correlation in the neural network the proposed method employs an optimization method ie nonnegative matrix factorization to reconstruct a lowrank embedding of the input data the experimental results show that the proposed method outperforms various popular attentionbased methods in recent years for various vision tasks ie semantic segmentation and image generation this paper is basically well written and easy to follow what they have done however i have several concerns that i listed as follows according to experimental results i find the proposed matrix decomposition based method outperforms several attention based methods in miou for semantic segmentation and fid for image generation nevertheless the parameters flops memory and running time are only compared with the dual attention network please compare more attention based methods to verify that the proposed method is more efficient than attention based methods the proposed method is similar to emanet especially for employing concept decomposition as the optimization algorithm please discuss the relation and difference between the proposed method and emanet i am very interested in the initialization of matrix decomposition in the supplementary the authors only discuss the initialization of d so what is the best initialization of c for nonnegative matrix factorization besides what is the warm start with online update figures 3 and 4 are low quality the coordinate text is too small after rebuttal i appreciate the authors detailed response to my questions which largely addresses my previous concerns it is a pleasure to review this interesting innovative and wellwritten paper a clear acceptdocsep summary the paper presents a method based on matrix decomposition md for encoding global context in computer vision tasks in particular a hamburger block is proposed encompassing matrix decomposition as its central part between two linear projection layers direct comparison and relations are drawn between the proposed method and the widely adopted selfattention paradigm the proposed method leads to improved results when hamburger blocks are used instead of selfattention blocks leading at the same time to a reduced number of parameters memory footprint and inference time strengths the paper presents a novel and simple method for capturing global context demonstrated on two challenging computer vision tasks namely semantic segmentation and image generation important advantages of the proposed method with respect to the selfattention method which is usually employed in such tasks are the fact that it can be easily adapted to a wide range of models and problems it is more efficient and has reduced memory requirements a onestep gradient method is proposed for propagating the gradients through the md optimization algorithm during training onestep gradient is shown to overcome the unstable gradient problems of the backpropagation through time bptt algorithm leading to improved performance a detailed analysis is provided comparing the two methods onestep vs bptt the evaluation is quite comprehensive comparing the proposed hamburger block with respect to similar selfattention blocks under several aspects accuracyfid score gpu load gpu time flops nr of parameters a detailed ablation study is also presented showing the effect of the most important factors of the proposed contribution in the final performance regarding writing quality the paper is clear and easy to read the main ideas and contributions are clearly stated and presented some issues regarding the structure of the sections are discussed below weaknesses the paper makes the more general claim that the proposed approach can be used to include any human inductive bias expressed through an optimization problem however only the problem of capturing global context in place of selfattention is explored to support this more general claim it would be important to include some representative examples even without providing a detailed evaluation on those possibly related to the previous point is the observation that nonnegative matrix factorization nmf seems to always perform better than the other two matrix decomposition methods vector quantization vq and concept decomposition cd this leads to certain questions as for example are there any problems that lend themselves better to the other types of md is the performance of vq and cd degraded because they are rendered soft what is the divergence of soft from original md as far as the original optimization problem is concerned the results of the ablation on temperature t provided in appendix g partially show that the softening of the algorithm might negatively affect the accuracy i find it also strange and possibly nearly violating formatting that the related work section is provided in the appendix some directly related work is discussed in the main text yet a more detailed discussion considering a broader set of works only appears in the appendix also regarding related work other works employing matrix decomposition in the context of deep learning are not covered eg sainath et al 2013 tariyal et al 2016 sainath t n kingsbury b sindhwani v arisoy e ramabhadran b 2013 lowrank matrix factorization for deep neural network training with highdimensional output targets in 2013 ieee international conference on acoustics speech and signal processing pp 6655-6659 tariyal s majumdar a singh r vatsa m 2016 deep dictionary learning ieee access 4 10096-10109 minor comments table 1 no details are provided for the metric used for the results table 6 it would be better to specify the difference between the two entries of hamgan the text in almost all figures is quite small and very hard to read at typical zoom factors 100 rating justification overall i think that the idea of using matrix decomposition as an architectural element to capture global context is quite interesting and novel also the method shows advantages with respect to selfattention as far as efficiency and memory requirements are concerned there are some issues regarding the generality of the proposed approach and the papers structure however i think that the papers strengths exceed its weaknesses rating and comments after the rebuttal the authors addressed my concerns in their feedback and the revised manuscript they have provided in particular i find the claimed contributions much clearer now in my view they have also suitably addressed the concerns raised in the other reviews as a result i increase my rating to 8 as i think that this work is interesting novel and impactful docsepthis paper proposes to use matrix decomposition to construct lowrank representations to find the longdistance correlations in context which is demonstrated to be more effective than the popular selfattention mechanism combining linear transformation and matrix decomposition as the core part the authors design the hamburger block to model global dependencies from the input as a residual output the authors propose differentiable modified vector quantization and nonnegative matrix factorization to perform matrix decomposition they propose onestep gradient an approximation of the backpropagation through time bptt algorithm to backpropagate the gradient through the matrix decomposition they conduct experiments on semantic segmentation and image generation to demonstrate the superiority of their methods regarding modelling global dependencies and computational cost i believe that there is clear novelty in the proposed method the paper is well written one weakness is that the experiment analysis is a little weak it will be great to see stronger experiments in the final version docsepsummary 1 this paper finds that matrix decomposition md performs as well as selfattention 2 according to the paper md approximates the given matrix with a lowrank one and this might be a helpful inductive bias 3 md can be implemented with vanilla matrix factorization or nonnegative matrix factorization 4 for stable learning this paper proposes an additional technique onestep gradient instead of backpropagation through time bptt 5 to validate the performance they experiment on semantic segmentation and image generation please reply to the below questions 1 what is the advantage of md over attention according to the paper the main advantage of md is a lower computational burden but there are several works for linear time attention furthermore it is hard to find sufficient reasons that md performs better than attentionbased models 2 there is no description for table 1 could you explain table 1 3 are there results for md with bptt instead of a onestep gradient on the segmentation or image generation tasks 4 what are the human priors in the conclusion section does it denote the lowrank it is hard to understand that lowrank will be helpful if md sets r the same as min(d, n) instead of a small r does md have lower performance 5 there are several works about 1 analyzing the lowrank problems in multihead attention and 2 incorporating the lowrank approximation into attention the discussion between this paper and related works is not enough 6 the title is is attention better than matrix decomposition but the paper is only for selfattention and md are there results for encoderdecoder structured tasks such as translation i suggest that this paper discusses the relationship between md and the below papers a factorization and attention 1 a tensorized transformer for language modeling b lowrank problems and attention 1 lowrank bottleneck in multihead attention models c lowrank attention 1 transformers are rnns fast autoregressive transformers with linear attention 2 linformer selfattention with linear complexity 3 implicit kernel attention 4 compact multihead selfattention for learning supervised text representations i read the authors valuable response and i keep my positive score
### Summary:
this paper introduces an alternative to selfattention based on matrix factorization and applies it to computer vision problems such as semantic segmentation the method is simple and novel and obtains competitive results compared to existing approaches the reviewers found the paper well written and easy to understand for these reasons i recommend accepting the paper
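As a concrete illustration of the block these reviews describe, the sketch below assumes a PyTorch-style API: a lower projection, a nonnegative matrix factorization solved by a few multiplicative-update iterations, a one-step gradient that differentiates only the final iteration, and a residual connection through an upper projection. It is reconstructed from the reviews' wording rather than the authors' code, so the module name, the rank r, the iteration count, and the random initialization of the factor matrices are all illustrative assumptions; the warm start and online update of the factors that one reviewer asks about are omitted.

```python
# Hypothetical sketch of a Hamburger-style block with NMF and a one-step gradient.
# All names and hyperparameters are illustrative, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NMFHamburger(nn.Module):
    def __init__(self, channels: int, rank: int = 64, steps: int = 6):
        super().__init__()
        self.lower = nn.Conv2d(channels, channels, 1)  # lower projection
        self.upper = nn.Conv2d(channels, channels, 1)  # upper projection
        self.rank, self.steps = rank, steps

    @staticmethod
    def nmf_update(x, d, c, eps=1e-6):
        # One multiplicative update for x ~ d @ c with nonnegative x, d, c.
        c = c * (d.transpose(1, 2) @ x) / (d.transpose(1, 2) @ d @ c + eps)
        d = d * (x @ c.transpose(1, 2)) / (d @ c @ c.transpose(1, 2) + eps)
        return d, c

    def forward(self, x):
        b, ch, h, w = x.shape
        z = F.relu(self.lower(x)).view(b, ch, h * w)     # nonnegative input matrix
        d = torch.rand(b, ch, self.rank, device=x.device)
        c = torch.rand(b, self.rank, h * w, device=x.device)
        with torch.no_grad():                            # inner optimization loop
            for _ in range(self.steps - 1):
                d, c = self.nmf_update(z, d, c)
        d, c = self.nmf_update(z, d, c)                  # one-step gradient
        low_rank = (d @ c).view(b, ch, h, w)             # lowrank global context
        return x + self.upper(low_rank)                  # residual output
```

Running the last update outside `torch.no_grad()` is the one-step gradient the reviewers ask about: gradients reach the lower projection only through that final multiplicative update, instead of being backpropagated through the whole inner loop as BPTT would require.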
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review: the paper proposed a novel attention mechanism and a new objective function that mitigates the distribution shifts caused by masked tokens for downstream tasks in mlm it demonstrates superior performance across benchmarks pros 1 good empirical results are demonstrated across an extensive suite of benchmarks ablation studies are well done hence i am willing to give a score of 6 despite the following concerns cons 1 my major concern is about the novelty of this paper in transformerxl 1 the idea of relative positional information in the form of eq 2 was already introduced the paper somehow intentionally omits the discussion following 2 only mentioning two earlier works of shaw et al 2018 huang et al 2018 i think the author should be honest and compare with the relative positional information introduced in transformerxl in the forefront that being said there are obviously still differences between transformerxl and the proposed methods and also the introduction of novel objectives in addition to the attention mechanism 2 however the previous concern brought up the second concern i have about the evaluations since the modification from the relative positional information of transformerxl to the proposed method is not too large i wonder if there is a reason to explain the better performances of the proposed methods hence i am worried if the baseline such as xlnet was welltuned we can see that for example in 2 the performance of xlnet was much better than originally reported i think the author should try to carefully evaluate the relative positional mechanisms of prior works with the authors own infrastructure while having everything else fixed 3 i find the word disentangled a bit misleading in this context disentanglement in ml 3 often refers to the ability to disentangle factors of variation of the data the work does not make use of any disentanglement techniques or have disentanglement representation architectures it simply uses a relative position mechanism that is the sum of four matrix products 1 dai et al transformerxl attentive language models beyond a fixedlength context 2 dai et al funneltransformer filtering out sequential redundancy for efficient language processing 3 locatello et al challenging common assumptions in the unsupervised learning of disentangled representationsdocsepsummary and contributions the authors proposed an extension to the word representation transformer architecture that takes into account disentangled features for position and content the disentanglement of attention is based on the composition of content and position parameter matrices in addition with combinations of both the main contribution is to tackle issues with the relative position embeddings used in standard transformer architectures the proposed model shows improvements on some benchmarks by using less pretraining data compared to the baseline strengths the proposed model tackles a known issue in transformer architectures the authors perform a comprehensive comparison on standard text benchmarks as well as an ablation study the findings show that disentangled attention improves results on some text benchmarks weaknesses related work on disentangled representations for text and the further motivation for incorporating disentanglement into the attention model are not discussed missing results of the variance in metrics with multiple runs on the downstream tasks as an extra contribution the authors could show if the improvements are due to the proposed model or to variance in parameter initialisation questions to the authors could you elaborate on disentangled representations and how they relate to the proposed attention model how does the enhanced masked language model compare with the masked language model how is the relative position parameter matrix initialised and how does it affect the language model performance docsepin this paper an improvement of the bert model is proposed it relies on the disentanglement of contents and relative positions in the encoding layers and on the incorporation of absolute positions in the decoding layer strengths the paper is well written the positioning to the state of the art is clear and the method is rigorously described the paper provides a complete evaluation using the existing benchmarks for nlp including ablation studies and evaluation of pretraining efficiency and deberta improves results in the major part of the cases weaknesses the proposed method is a relative increment of previous methods in section 411 the way performance increase or decrease is reported is not exact 11 11 points do we have an idea of the statistical significance of the improvements it would be interesting to have the rationale for the mitigated result obtained on table 1 is deberta more relevant for specific tasks the authors claim that they evaluate their results on a generation task but it rather seems that they evaluate language modeling using perplexity the use of nondocumented acronyms ppl for example could be not understandable outside the nlp community there is some redundancy in the text second paragraph of 32 and fourth paragraph of the introduction that is not necessary docsepthe paper proposes a bertinspired model that adds two main different architectural decisions different content and position representations instead of a sum and absolute positions in the decoding layer the authors run the standard suite of glue benchmark experiments on both large and base setups as well as a generation setup wikitext103 the modifications proposed are not gamechanging but the evaluations are interesting in terms of understanding the impact of these modifications one thing that i find disingenuous is the fact that their disentangled approach does introduce additional parameters which is not quantified or even mentioned in the main paper i had to dig into the appendix to see that this introduces about 49m additional parameters an increment of 13 another problem that i have is with their experimental comparisons especially the ones in the main part sec 411 im listing below the most important issues in this section roberta and xlnet are trained for 500k steps with 8k samples in a step which amounts to four billion passes over training samples this is confusing what you mean to say is that the models see about four billion training examples the term passes is usually used as an equivalent to epochs ie how many times the model goes over the entire training set table 1 which compares deberta with previous models with around 350m parameters bert roberta xlnet albert and electra note that albert is actually around 235m parameters significantly less than all the others you cannot simply bundle all together and claim they are equivalent parametersizewise deberta still outperforms them albertxxlarge in terms of the average glue score note that the difference here wrt albertxxlarge is from 89.96 to 90.00 ie 0.04 for the average with a tie 3-3 in terms of wins for specific tasks unless you can show that the 0.04 difference is statistically significant you need to tone down the claim about outperforming we summarize the results in table 2 compared to the previous sota models with similar sizes including bert roberta xlnet and megatron336m deberta consistently outperforms them in all the 7 tasks taking race as an example deberta is significantly better than the previous sota xlnet with an improvement of 1.4 86.8 vs 85.4 for whatever reason the authors omit albert from the comparison done for table 2 in spite of its even smaller size compared to the included ones and the fact that the albert numbers for these tasks are readily available in the paper taking race as an example the albert single model has 86.5 accuracy therefore nullifying the claim of 1.4 improvement re references a lot of the references use the arxiv version for papers that have been peerreviewed and published please fix
### Summary:
all reviewers gave positive though not very strong scores for this work although the technical contribution of the paper is somewhat incremental the reviewers agree that it solidly addresses the known important issues in bert and the experiments are extensive enough to demonstrate the empirical effectiveness of the method the main concerns raised by the reviewers are regarding the novelty and the discussion with respect to related work as well as some unclear writing in the details but i think the pros outweigh the cons and thus would like to recommend acceptance of the paper we do encourage the authors to properly take the reviewers comments into account to further polish the paper in the final version
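The disentangled attention at the center of these reviews can be sketched compactly: content and relative-position information get separate projections, and each attention logit is the sum of content-to-content, content-to-position, and position-to-content terms (the fourth, position-to-position product from the "sum of four matrix products" one reviewer mentions is reportedly dropped). The snippet below is a hypothetical single-head PyTorch illustration assembled from the reviews' description rather than the released implementation; the relative-distance clamping, the sqrt(3d) scaling, and the omission of multiple heads, masking, dropout, and the absolute-position decoding layer are simplifications, and the sign convention for relative distances is glossed over.

```python
# Hypothetical single-head sketch of disentangled (content/position) attention.
# Names, distance bucketing, and scaling are illustrative, not the released code.
import torch
import torch.nn as nn

class DisentangledAttention(nn.Module):
    def __init__(self, dim: int, max_rel_dist: int = 128):
        super().__init__()
        self.q_c = nn.Linear(dim, dim)   # content query
        self.k_c = nn.Linear(dim, dim)   # content key
        self.q_r = nn.Linear(dim, dim)   # position query, for the p->c term
        self.k_r = nn.Linear(dim, dim)   # position key, for the c->p term
        self.v = nn.Linear(dim, dim)
        self.rel_emb = nn.Embedding(2 * max_rel_dist, dim)
        self.max_rel_dist = max_rel_dist

    def forward(self, h):
        b, n, d = h.shape                                     # content hidden states
        qc, kc, v = self.q_c(h), self.k_c(h), self.v(h)
        pos = torch.arange(n, device=h.device)
        rel = (pos[None, :] - pos[:, None]).clamp(
            -self.max_rel_dist, self.max_rel_dist - 1) + self.max_rel_dist
        p = self.rel_emb(rel)                                 # (n, n, d) relative embeddings
        c2c = qc @ kc.transpose(1, 2)                         # content -> content
        c2p = torch.einsum('bid,ijd->bij', qc, self.k_r(p))  # content -> position
        p2c = torch.einsum('bjd,jid->bij', kc, self.q_r(p))  # position -> content
        scores = (c2c + c2p + p2c) / (3 * d) ** 0.5
        return torch.softmax(scores, dim=-1) @ v
```

The separate `q_r`/`k_r` projections and the relative-position embedding table are also a plausible source of the extra parameters one reviewer notes are only quantified in the appendix.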
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 4081, 247, 4460, 4116, 5122, 285, 247, 747, 8103, 1159, 326, 4784, 304, 684, 253, 3268, 15036, 4269, 407, 34741, 21761, 323, 15450, 8892, 275, 13361, 78, 352, 14371, 8936, 3045, 2439, 49602, 50276, 856, 84, 337, 1175, 16774, 1543, 403, 5183, 2439, 271, 9470, 18880, 273, 49602, 28913, 2175, 403, 973, 2218, 7613, 891, 717, 7378, 281, 1918, 247, 4868, 273, 721, 5747, 273, 253, 1563, 7350, 50276, 5040, 337, 619, 2201, 4468, 310, 670, 253, 38135, 273, 436, 2929, 50276, 249, 39707, 30291, 18, 253, 2934, 273, 4103, 40798, 1491, 275, 253, 830, 273, 16186, 374, 369, 2168, 5611, 253, 2929, 10380, 23209, 35991, 253, 5955, 1563, 374, 760, 29570, 767, 4321, 2987, 273, 439, 1403, 1162, 355, 4765, 30287, 606, 1162, 355, 4765, 891, 1158, 253, 2488, 943, 320, 8274, 285, 7277, 342, 4103, 40798, 1491, 5611, 275, 39707, 30291, 275, 253, 43213, 50276, 3529, 1146, 753, 627, 310, 9090, 1335, 3910, 875, 39707, 30291, 285, 253, 4081, 3082, 285, 671, 253, 10199, 273, 4460, 16566, 275, 1635, 281, 253, 4116, 5122, 50275, 19, 2299, 253, 2045, 4468, 3982, 598, 253, 1273, 4468, 891, 452, 670, 253, 27163, 1580, 253, 11237, 4103, 40798, 1491, 273, 39707, 30291, 281, 253, 4081, 1332, 310, 417, 1512, 1781, 891, 4282, 604, 627, 310, 247, 1921, 281, 5513, 253, 1805, 16226, 273, 253, 4081, 3082, 7613, 891, 717, 11926, 604, 253, 8245, 824, 347, 1269, 77, 3024, 369, 973, 85, 37437, 359, 476, 923, 326, 323, 1650, 275, 374, 253, 3045, 273, 1269, 77, 3024, 369, 1199, 1805, 685, 8927, 2361, 891, 1158, 253, 2488, 943, 1611, 281, 9257, 7472, 253, 4103, 40798, 6297, 273, 2720, 2987, 342, 4477, 1211, 11319, 1223, 1907, 3253, 2010, 4229, 50276, 20, 891, 1089, 253, 3159, 557, 290, 33195, 247, 2372, 24363, 275, 436, 3634, 557, 290, 606, 1338, 275, 13361, 495, 2223, 10770, 281, 253, 3745, 281, 557, 290, 2134, 2616, 273, 10575, 273, 253, 941, 253, 789, 1057, 417, 1056, 897, 273, 667, 557, 290, 33195, 5609, 390, 452, 557, 290, 606, 1338, 6779, 1116, 5671, 980, 352, 3365, 897, 247, 4103, 1899, 5122, 28763, 253, 2020, 273, 1740, 4315, 3580, 50275, 18, 277, 2284, 1162, 355, 39707, 30291, 33056, 422, 3448, 3210, 4457, 247, 4229, 3985, 3634, 50276, 19, 277, 2284, 1162, 355, 37346, 16702, 254, 19690, 562, 22453, 39296, 323, 5919, 3448, 5162, 50276, 20, 1150, 255, 6646, 1162, 355, 11132, 1846, 13260, 275, 253, 440, 35421, 4715, 273, 557, 290, 33195, 14237, 7152, 339, 793, 360, 3454, 285, 9021, 50276, 783, 4477, 4081, 271, 6880, 281, 253, 3159, 6779, 39707, 10336, 50276, 3529, 3936, 715, 2395, 557, 290, 2134, 3386, 323, 1899, 285, 2600, 253, 557, 290, 2134, 273, 4116, 310, 1754, 327, 253, 5889, 273, 247, 2600, 285, 1899, 4764, 12624, 275, 1635, 342, 13553, 273, 1097, 50276, 783, 2022, 7680, 310, 281, 18915, 3374, 342, 253, 4103, 1899, 46234, 908, 327, 2629, 39707, 35615, 253, 4081, 1566, 2722, 11701, 327, 690, 49602, 407, 970, 1679, 3215, 26208, 941, 2429, 281, 253, 8245, 50276, 296, 3755, 20556, 50275, 783, 4081, 1566, 39223, 247, 1929, 2523, 275, 39707, 35615, 50276, 783, 4477, 1347, 247, 11088, 5301, 327, 2629, 2505, 49602, 347, 973, 347, 271, 28913, 1263, 50276, 783, 4342, 921, 326, 557, 290, 2134, 4116, 19132, 1543, 327, 690, 2505, 49602, 50276, 20881, 1255, 265, 50275, 4919, 789, 327, 557, 290, 2134, 14237, 323, 2505, 285, 253, 2007, 16038, 323, 970, 557, 290, 606, 1338, 715, 253, 4116, 1566, 403, 417, 5469, 50276, 33722, 1543, 273, 253, 11041, 275, 17082, 342, 2709, 6613, 327, 253, 15450, 
8892, 347, 271, 4465, 7680, 253, 4477, 812, 50276, 9029, 604, 253, 11701, 403, 1955, 281, 253, 4081, 1566, 390, 11041, 275, 4764, 3302, 5837, 50275, 34974, 281, 253, 4477, 50275, 16534, 368, 21184, 327, 557, 290, 33195, 14237, 285, 849, 597, 14588, 281, 253, 4081, 4116, 1566, 50275, 5430, 1057, 352, 7277, 253, 8655, 34741, 3448, 1566, 342, 253, 34741, 3448, 1566, 50276, 5430, 1057, 253, 4103, 1899, 4764, 4315, 310, 3302, 1701, 285, 849, 1057, 352, 2818, 253, 3448, 1566, 3045, 50276, 7152, 339, 9852, 436, 2929, 271, 7756, 273, 270, 797, 1566, 310, 4081, 352, 15771, 327, 253, 557, 290, 606, 1338, 273, 9410, 285, 4103, 6887, 275, 253, 9706, 8090, 285, 327, 253, 24319, 273, 7880, 6887, 275, 253, 28490, 3828, 50275, 296, 3755, 20556, 50276, 783, 2929, 310, 973, 3542, 253, 19274, 281, 253, 1375, 273, 253, 1445, 310, 2590, 285, 253, 1332, 310, 8132, 29689, 2529, 50275, 783, 2929, 3400, 247, 3426, 7103, 970, 253, 5368, 49602, 323, 295, 24343, 285, 1690, 28913, 2175, 285, 7103, 273, 3215, 26208, 6733, 285, 372, 589, 893, 19132, 1543, 275, 253, 2201, 629, 273, 253, 2219, 50275, 20881, 1255, 265, 50276, 783, 4081, 1332, 310, 247, 4103, 17627, 273, 2045, 3082, 50276, 249, 2593, 33232, 253, 1039, 3045, 2572, 390, 6379, 310, 2361, 310, 417, 3242, 1903, 50276, 883, 2792, 50276, 3088, 359, 452, 271, 2934, 273, 253, 7605, 8453, 273, 253, 11701, 50276, 262, 651, 320, 4722, 281, 452, 253, 24775, 323, 253, 4784, 27285, 906, 2797, 327, 2829, 337, 310, 372, 589, 893, 625, 4623, 323, 2173, 8892, 50276, 783, 4477, 1750, 326, 597, 7472, 616, 1543, 327, 5978, 4836, 533, 352, 2581, 3133, 326, 597, 7472, 3448, 14053, 970, 44229, 414, 50276, 783, 897, 273, 1327, 14290, 913, 1406, 90, 983, 268, 446, 323, 1650, 326, 812, 320, 417, 34007, 3345, 253, 295, 24343, 3114, 597, 403, 690, 39296, 275, 253, 2505, 1273, 12494, 273, 4567, 285, 7002, 12494, 273, 253, 10199, 326, 310, 417, 3309, 50276, 7152, 339, 431, 248, 2929, 29328, 247, 270, 797, 38358, 1566, 326, 11323, 247, 767, 2022, 1027, 27934, 7089, 1027, 2600, 285, 1899, 14237, 3185, 273, 247, 2020, 285, 7880, 6887, 275, 253, 28490, 3828, 253, 4477, 1408, 253, 2629, 18880, 273, 28400, 22791, 4679, 327, 1097, 1781, 285, 2613, 873, 8777, 347, 973, 347, 247, 5978, 9978, 259, 1479, 614, 633, 12172, 50275, 783, 14586, 4081, 403, 417, 2165, 28276, 533, 253, 27163, 403, 4722, 275, 2426, 273, 4685, 253, 3486, 273, 841, 14586, 581, 2181, 326, 891, 1089, 557, 25203, 3472, 310, 958, 326, 616, 557, 290, 33195, 2746, 1057, 9569, 3081, 3602, 534, 310, 417, 18755, 390, 1014, 5393, 275, 253, 2022, 2929, 891, 574, 281, 2836, 715, 253, 30762, 281, 923, 326, 436, 23970, 670, 7584, 78, 3081, 3602, 17627, 273, 2145, 50276, 23955, 1895, 326, 891, 452, 310, 342, 616, 5661, 14023, 3340, 253, 4394, 275, 2022, 629, 4706, 33232, 516, 16485, 2708, 253, 954, 1774, 3374, 275, 436, 2593, 50276, 287, 589, 893, 285, 1269, 77, 3024, 403, 10166, 323, 6783, 76, 5018, 342, 854, 76, 3530, 275, 247, 3213, 534, 8322, 281, 1740, 6494, 11999, 689, 3733, 3530, 436, 310, 21643, 752, 368, 1599, 281, 1333, 310, 326, 253, 3210, 923, 670, 1740, 6494, 3733, 6667, 253, 1307, 11999, 310, 908, 3798, 347, 271, 6425, 281, 44540, 26332, 849, 1142, 2069, 253, 1566, 4566, 689, 253, 2862, 3733, 873, 50275, 2420, 337, 534, 26662, 372, 589, 893, 342, 2045, 3210, 342, 1475, 16176, 78, 3602, 270, 797, 687, 589, 893, 1269, 77, 3024, 355, 6291, 285, 1516, 376, 3877, 326, 355, 6291, 310, 2686, 1475, 23540, 78, 3602, 3012, 1679, 685, 512, 253, 2571, 368, 2550, 3365, 13204, 512, 2366, 285, 1750, 597, 403, 6425, 3602, 907, 3020, 50276, 
615, 589, 893, 1335, 41731, 13015, 731, 355, 6291, 5260, 16374, 275, 1307, 50276, 1171, 253, 3388, 28400, 4868, 3877, 326, 253, 3064, 1060, 8772, 355, 6291, 5260, 16374, 310, 432, 854, 28053, 281, 898, 933, 26332, 209, 5525, 323, 253, 3388, 342, 247, 13898, 5922, 275, 2426, 273, 14896, 323, 2173, 8892, 5734, 368, 476, 921, 326, 253, 209, 5525, 3064, 310, 10126, 1534, 368, 878, 281, 10541, 1066, 253, 1750, 670, 41731, 14692, 50275, 664, 26799, 253, 1543, 275, 2829, 374, 2429, 281, 253, 2045, 256, 5503, 3210, 342, 2074, 9552, 1690, 270, 797, 687, 589, 893, 1269, 77, 3024, 285, 19488, 255, 1406, 23126, 78, 372, 589, 893, 12724, 41731, 13015, 731, 275, 512, 253, 818, 8892, 3192, 5492, 347, 271, 1650, 372, 589, 893, 310, 3012, 1805, 685, 2045, 256, 5503, 1269, 77, 3024, 342, 271, 7756, 273, 1638, 854, 2358, 4632, 854, 3439, 323, 5913, 1921, 253, 4477, 35991, 355, 6291, 432, 253, 5301, 2218, 323, 2829, 374, 275, 15866, 273, 697, 1014, 4577, 1979, 2429, 281, 253, 2908, 4394, 285, 253, 958, 326, 253, 355, 6291, 3904, 323, 841, 8892, 403, 12450, 2130, 275, 253, 2929, 3192, 5492, 347, 271, 1650, 355, 6291, 2014, 1566, 556, 854, 2082, 7200, 3103, 3635, 5411, 253, 1750, 273, 1638, 7756, 50275, 250, 10414, 50276, 66, 2257, 273, 253, 10414, 897, 253, 549, 32693, 2715, 323, 9380, 326, 452, 644, 14218, 33349, 285, 3863, 4496, 4993, 2490, 187, 4118, 18435, 27, 455, 30628, 3534, 2167, 417, 1077, 2266, 50276, 10247, 7363, 323, 436, 789, 50276, 20261, 253, 7681, 7680, 273, 253, 2929, 310, 8489, 32809, 253, 30628, 5194, 326, 352, 4891, 314, 12453, 253, 1929, 1774, 3374, 275, 270, 797, 285, 253, 4679, 403, 9470, 2217, 281, 7568, 253, 16774, 12510, 273, 253, 1332, 50276, 783, 2022, 7350, 5439, 407, 253, 30628, 403, 5001, 253, 38135, 285, 253, 5955, 342, 1675, 281, 2905, 789, 347, 973, 347, 690, 12744, 25527, 275, 253, 2508, 50276, 2858, 891, 1158, 253, 5847, 32180, 798, 253, 772, 285, 3021, 651, 751, 281, 5583, 14924, 273, 253, 2929, 50276, 664, 513, 11907, 4477, 281, 6283, 1379, 275, 253, 30628, 5701, 281, 2007, 40167, 253, 2929, 275, 253, 2457, 2715, 50276 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

The paper is generally well written. Section 3 is a little confusing, as it is not readily clear which part of the model is using flows or what the flow parameters are that need to be estimated. What will the neural network structure look like, and how is it trained? One might wonder why a flow-based model is used among these many deep generative models: where does this invertibility help out? The GRU model is not well elaborated and is a little unclear. It would have been interesting to see where the attention model usually attends to, either in a real-world data set or in simple, intuitive toy tasks. Equations 14 and 19 are hard to follow; it would be good to elaborate on them, at least in an appendix section, due to space limitations. Will the log-likelihood be sufficient for evaluation? Why not more intuitive tasks like event or time predictions? More explanation and justification would have been helpful.

docsep

This work investigates a new class of parameterizations for spatiotemporal point processes which uses neural ODEs to enable flexible, high-fidelity models of discrete events that are localized in continuous time and space.

Strengths: this work is essentially an extension of the neural jump SDEs (Jia and Benson, 2019), where the temporal dynamics is modeled as an ODE and the spatial pdf is modeled as a history-dependent Gaussian mixture distribution. In this work the spatial pdf is further extended to an ODE-based dynamics; for this purpose three different continuous normalizing flow models are proposed (time-varying CNF, jump CNF, attentive CNF). Also, a large number of experiments are conducted and baselines are compared to validate the conclusion.

I recommend rejection at the current stage for the reasons below.

Weakness: a major concern is that, if my understanding is right, every mark x_i is modeled as an ODE of x_i(t) on [0, t_i] in the time-varying CNF and attentive CNF, so there are n (the number of points) ODEs in the model. This setup is problematic because any points except the 1st are impossible to happen at time 0, so they cannot possibly possess a mark x_i at time 0; in fact any time before t_{i-1} is impossible. A more reasonable way to characterize the dynamics of x_i is to model the ODE on [t_{i-1}, t_i], which is used in the jump CNF. I understand this setup contributes to the parallel computation with the reparameterization trick; in fact this is the reason why both the time-varying CNF and attentive CNF can be computed in parallel but the jump CNF cannot. The attentive CNF can be seen as a generalized version of the time-varying CNF due to the introduction of history dependence, but the jump CNF is a different model, as stated above. Also, the jump CNF can model the abrupt change of the spatial pdf, but the time-varying CNF and attentive CNF cannot. Theoretically speaking, the jump CNF should have a more powerful fitting capability, assuming other parts are the same. Compared with those two models, why does the attentive CNF model achieve better or close performance compared to the jump CNF in most experiments? Does that mean the dynamics in most datasets have no discontinuity? Maybe a simple synthetic experiment with discontinuity in the dynamics can help prove this.

Some specific concerns: some synthetic data experiments with a specific setup (e.g. discontinuity) are needed to give a deep understanding of the two proposed spatial CNF models. Typos: "orof" in the second line from the bottom on the first page; "01" in the second line of Eq. 19.

docsep

The paper proposes a neural-ODE-based point process for spatiotemporal data. Under the general framework, three particular variants are proposed; they handle data with different characteristics and have different computational efficiency.

Pros: the idea is interestingly novel. The proposed model architectures are all carefully thought through; each model component is well motivated, being supported by convincing justification. The presentation is clear, and some technical parts are enjoyable to read. The empirical results on the log-likelihood comparison look compelling.

Cons: the experiments are somewhat weak; this is the main reason I didn't give a higher score. For the temporal comparison there isn't any neural baseline model. There isn't any prediction accuracy comparison. Empirical analysis is very limited, maybe because of the limited experiments conducted.

Questions: the use of this notation is really nonstandard in statistics and machine learning, where it usually denotes something like a ground truth. Can the authors think of another notation to omit h? Or maybe h can be kept, since the single-column format is spacious enough to host long equations. Eqn. 8: why not log p(x | t)? Is this a typo? An RNN, particularly an LSTM with continuous-time hidden state, was proposed by Mei and Eisner (2017), earlier than the cited Rubanova et al. (2019); moreover, the math properties described by Eqns. 10-12 also hold for Mei and Eisner (2017). Can the authors appropriately acknowledge these connections?

docsep

This paper proposes a novel deep approach to the learning of spatiotemporal point processes via normalizing flows. Overall I think this is a good paper presenting many interesting ideas, and it may impact further research on point processes. The combination of a flow-based network structure and the probabilistic model (point process) should make sense. The formulation and presentation are good, which makes the paper easy to follow. However, there are a couple of questions for the authors to further address. 1. It seems that the proposed model contains a jump CNF for the mark probability p(x_t | t). I am not sure if it really makes sense that the probability of the mark has a jump over time; such a formulation seems to be problematic. It is straightforward for the ground intensity function to consider the discontinuity at the point when an event occurs, as it represents a self-exciting/inhibitive temporal point process; the features are often assumed to be homogenous over time. It would be better if the authors can justify such a formulation. 2. It seems intractable for the model to predict the next event: to compute the arrival time and mark for the next event, one should consider integrals with respect to lambda and p, which looks quite complex when they involve ODEs. 3. The authors are advised to further illustrate the attentive CNF. This seems to be a very interesting topic, but the authors only give a brief introduction in very short content; it is not very clear how the authors incorporate the attention mechanism into the CNF. 4. The experiment looks somehow weak. First, the authors criticize that KDE has a large entropy variance; however, the variance of KDE depends on the kernel bandwidth, so it is not fair to judge based on just one prefixed kernel bandwidth. Second, the authors seem to miss a couple of baselines that deal with the same task in the experiment: the NHP and the RMTPP are also able to model the spatial-temporal point process if the losses are changed to metrics on Euclidean space. Besides, please consider: Li, L. and Zha, H. (2014), Learning Parametric Models for Social Infectivity in Multi-dimensional Hawkes Processes, AAAI 2014; Li, T. and Ke, Y. (2020), Tweedie-Hawkes Processes: Interpreting the Phenomena of Outbreaks, AAAI 2020. ### Summary:
This paper presents a model for spatiotemporal point processes using neural ODEs. Some technical innovations are introduced to allow the conditional intensity to change discontinuously in response to new events; likewise, the spatial intensity is expanded upon that proposed in prior work on neural SDEs. Reviewers were generally positive about the contributions and the empirical assessments, and the authors made substantial improvements during the discussion phase.
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 310, 3839, 973, 3542, 50275, 4674, 575, 20, 310, 247, 1652, 21643, 347, 697, 417, 12450, 2590, 534, 629, 273, 253, 1566, 310, 970, 14221, 390, 752, 403, 253, 2685, 3602, 326, 3058, 281, 320, 5998, 575, 849, 50276, 9846, 253, 11454, 2990, 2605, 1007, 751, 849, 310, 352, 10166, 50276, 531, 1537, 4282, 2139, 247, 2685, 1754, 1566, 310, 908, 2190, 841, 1142, 3676, 1006, 800, 3210, 835, 1057, 436, 30332, 2322, 1361, 562, 50276, 783, 26970, 1566, 310, 417, 973, 50221, 285, 310, 247, 1652, 12744, 50276, 262, 651, 452, 644, 4722, 281, 923, 835, 253, 4116, 1566, 3798, 863, 1727, 281, 2057, 275, 247, 1524, 1533, 941, 873, 390, 275, 2969, 27350, 575, 85, 899, 8892, 50276, 29813, 1638, 285, 655, 403, 1892, 281, 956, 697, 1175, 281, 21184, 327, 731, 387, 1878, 275, 271, 30762, 2593, 1955, 281, 2317, 7364, 50276, 9846, 253, 2412, 7513, 10202, 4209, 575, 1542, 7103, 2139, 417, 625, 27350, 575, 40480, 751, 2362, 390, 673, 13650, 625, 8813, 285, 22861, 651, 452, 644, 9371, 5474, 33032, 2520, 789, 2340, 684, 247, 747, 966, 273, 4764, 5904, 323, 7046, 7173, 358, 23702, 1127, 4870, 534, 4648, 11454, 258, 615, 281, 8046, 12112, 1029, 71, 21718, 3210, 273, 13358, 3394, 326, 403, 15783, 275, 5415, 673, 285, 2317, 50275, 296, 3755, 20556, 436, 789, 310, 9093, 271, 6880, 273, 253, 11454, 6923, 256, 3229, 480, 571, 50276, 67, 24480, 6247, 835, 253, 11935, 8062, 310, 23115, 347, 271, 258, 615, 285, 253, 8820, 31697, 310, 23115, 347, 247, 2892, 7976, 305, 12064, 7802, 3268, 275, 436, 789, 253, 8820, 31697, 310, 2007, 6508, 281, 271, 258, 615, 1754, 8062, 323, 436, 4096, 1264, 1027, 5415, 2622, 3006, 2685, 3210, 403, 4081, 673, 39381, 272, 260, 35478, 6923, 260, 35478, 33056, 422, 260, 35478, 671, 247, 1781, 1180, 273, 4679, 403, 5196, 285, 1666, 25379, 403, 2429, 281, 17813, 253, 6452, 50276, 74, 5583, 18235, 387, 253, 1655, 3924, 323, 253, 4606, 2708, 50276, 20881, 1255, 247, 2201, 4468, 310, 604, 619, 4685, 310, 987, 1046, 1616, 1269, 74, 310, 23115, 347, 271, 258, 615, 273, 1269, 262, 327, 470, 16816, 275, 253, 275, 673, 39381, 272, 260, 35478, 285, 33056, 422, 260, 35478, 594, 627, 403, 295, 253, 1180, 273, 2792, 258, 3229, 275, 253, 1566, 436, 9978, 310, 20276, 984, 667, 2792, 3707, 253, 337, 296, 403, 7479, 281, 5108, 387, 673, 470, 594, 597, 37593, 4360, 7081, 247, 1616, 1269, 74, 387, 673, 470, 275, 958, 667, 673, 1078, 16816, 18, 310, 7479, 247, 625, 5272, 1039, 281, 17710, 253, 8062, 273, 1269, 74, 310, 281, 1566, 253, 258, 615, 327, 16816, 18, 16816, 534, 310, 908, 275, 253, 6923, 260, 35478, 891, 2096, 436, 9978, 17904, 281, 253, 7529, 13782, 342, 253, 294, 19484, 1320, 10480, 275, 958, 436, 310, 1921, 2139, 1097, 673, 39381, 272, 260, 35478, 285, 33056, 422, 260, 35478, 476, 320, 10302, 275, 7529, 533, 6923, 260, 35478, 2550, 253, 33056, 422, 260, 35478, 476, 320, 2326, 347, 247, 14923, 2715, 273, 673, 39381, 272, 260, 35478, 1955, 281, 253, 10199, 273, 2892, 10096, 533, 253, 6923, 260, 35478, 310, 247, 1027, 1566, 347, 4767, 1840, 50276, 12563, 253, 6923, 260, 35478, 476, 1566, 253, 21213, 1818, 273, 253, 8820, 31697, 533, 253, 673, 39381, 272, 260, 35478, 285, 33056, 422, 260, 35478, 2550, 28055, 8288, 253, 6923, 260, 35478, 943, 452, 247, 625, 6422, 13532, 14603, 7384, 643, 4243, 403, 1072, 2429, 342, 1110, 767, 3210, 2139, 1057, 253, 33056, 422, 260, 35478, 1566, 5115, 247, 1805, 390, 2810, 3045, 685, 6923, 260, 35478, 275, 954, 4679, 1057, 326, 
1599, 253, 8062, 275, 954, 15302, 452, 642, 16196, 10533, 5046, 247, 2969, 13506, 3368, 342, 16196, 10533, 275, 8062, 476, 1361, 5276, 436, 50275, 8826, 2173, 7350, 690, 13506, 941, 4679, 342, 2173, 9978, 24088, 16196, 10533, 403, 3058, 281, 1918, 247, 3676, 4685, 273, 253, 767, 4081, 8820, 260, 35478, 3210, 50276, 555, 5367, 258, 287, 71, 253, 1273, 1386, 432, 253, 5004, 275, 253, 806, 3239, 50276, 520, 253, 1273, 1386, 273, 16186, 746, 5474, 339, 431, 248, 2929, 29328, 247, 11454, 853, 3169, 1127, 1232, 323, 7046, 7173, 358, 23702, 941, 762, 253, 2087, 7792, 1264, 1798, 11640, 403, 4081, 597, 6016, 941, 342, 1027, 5319, 285, 452, 1027, 15180, 6733, 50275, 856, 84, 50275, 783, 2934, 310, 4722, 314, 4460, 50275, 783, 4081, 1566, 35615, 403, 512, 9257, 1869, 949, 1016, 1566, 4445, 310, 973, 24013, 8550, 1146, 4516, 407, 21414, 22861, 50275, 783, 9759, 310, 2590, 690, 7681, 4243, 403, 30357, 281, 1239, 50275, 783, 16774, 1543, 327, 2412, 7513, 10202, 5301, 1007, 18511, 50275, 5040, 50275, 783, 4679, 403, 8489, 5075, 436, 310, 253, 2022, 1921, 891, 42126, 1918, 247, 2169, 4868, 50275, 1542, 11935, 5301, 627, 310, 2649, 667, 11454, 8245, 1566, 50275, 9088, 310, 2649, 667, 10554, 7200, 5301, 50275, 358, 5378, 474, 1783, 310, 1077, 3710, 5046, 260, 7958, 273, 3710, 4679, 5196, 50275, 34974, 50275, 783, 897, 273, 50276, 261, 1663, 1327, 15291, 275, 9990, 285, 5145, 4715, 50276, 27978, 12853, 8489, 3216, 33024, 476, 4477, 1158, 273, 1529, 14951, 281, 35991, 288, 390, 5046, 288, 476, 320, 4934, 1580, 253, 2014, 11631, 5981, 310, 37039, 2217, 281, 3167, 1048, 7424, 50275, 15214, 25, 2139, 417, 2412, 268, 89, 50276, 85, 310, 436, 247, 1745, 80, 50275, 83, 9866, 3782, 298, 296, 78, 342, 44351, 26202, 553, 8763, 1375, 369, 4081, 407, 479, 74, 285, 299, 261, 1216, 4240, 4321, 685, 253, 11106, 7692, 45388, 1162, 355, 6247, 50276, 3062, 1189, 253, 14168, 3607, 2529, 407, 16186, 79, 6903, 19, 671, 2186, 323, 479, 74, 50276, 70, 261, 1216, 4240, 50276, 5092, 4477, 20420, 14409, 841, 10291, 50275, 7152, 33032, 2520, 2929, 29328, 247, 4460, 3676, 2746, 281, 253, 4715, 273, 7046, 7173, 358, 23702, 1127, 4870, 3066, 2622, 3006, 14221, 4583, 891, 1158, 436, 310, 247, 1175, 2929, 15250, 1142, 4722, 5697, 285, 778, 3486, 2007, 2561, 327, 1127, 4870, 50276, 783, 5019, 273, 2685, 3169, 2990, 2605, 285, 253, 37851, 1566, 3659, 1232, 943, 1056, 3282, 253, 15895, 285, 9759, 403, 1175, 534, 2789, 253, 2929, 3477, 281, 956, 2299, 627, 403, 247, 4564, 273, 3533, 323, 253, 4477, 281, 2007, 2953, 337, 352, 3133, 326, 253, 4081, 1566, 4428, 247, 6923, 260, 35478, 323, 253, 1616, 5912, 268, 5664, 633, 85, 516, 417, 2119, 604, 352, 1663, 2789, 3282, 326, 253, 5912, 273, 1616, 556, 247, 6923, 689, 673, 1060, 824, 15895, 3133, 281, 320, 20276, 352, 310, 15246, 323, 253, 3216, 7133, 1159, 281, 1908, 253, 16196, 10533, 387, 253, 1127, 672, 271, 2362, 6634, 347, 352, 6125, 247, 11329, 453, 89, 2996, 40365, 1483, 11935, 1127, 1232, 253, 3386, 403, 2223, 8025, 281, 320, 2860, 11426, 689, 673, 352, 651, 320, 1805, 604, 253, 4477, 476, 15249, 824, 247, 15895, 374, 352, 3133, 540, 44374, 323, 253, 1566, 281, 3283, 253, 1735, 2362, 281, 11897, 253, 13024, 673, 285, 1616, 323, 253, 1735, 2362, 581, 943, 1908, 28676, 342, 1675, 281, 29331, 285, 268, 534, 4453, 3240, 2570, 672, 597, 6388, 258, 3229, 50276, 20, 253, 4477, 403, 15140, 281, 2007, 17093, 253, 33056, 422, 260, 35478, 436, 3133, 281, 320, 247, 1077, 4722, 9400, 533, 253, 4477, 760, 1918, 247, 4864, 10199, 275, 1077, 2159, 2600, 352, 310, 417, 1077, 2590, 849, 253, 4477, 19071, 
253, 4116, 5122, 281, 260, 35478, 50276, 21, 253, 3368, 4453, 10380, 5075, 50276, 7053, 253, 4477, 45688, 326, 49745, 556, 247, 1781, 15579, 11041, 2299, 253, 11041, 273, 49745, 7024, 327, 253, 10295, 16992, 352, 310, 417, 4344, 281, 5963, 1754, 327, 816, 581, 638, 20188, 10295, 16992, 1273, 253, 4477, 1646, 281, 2985, 247, 4564, 273, 1666, 25379, 326, 2968, 342, 253, 1072, 4836, 275, 253, 3368, 253, 295, 28368, 285, 253, 391, 6917, 377, 403, 671, 2104, 281, 1566, 253, 8820, 46258, 1127, 1232, 604, 253, 11655, 403, 1818, 281, 253, 17082, 327, 299, 26365, 2317, 16280, 4496, 1908, 50276, 965, 298, 50276, 91, 3227, 288, 4059, 4715, 36833, 3210, 323, 2675, 12741, 2351, 275, 23964, 37613, 48011, 8583, 4870, 39951, 2284, 4059, 632, 246, 50276, 413, 340, 9169, 13660, 264, 466, 21733, 8583, 4870, 29375, 253, 16958, 273, 34894, 39951, 2284, 9169, 187, 187, 4118, 18435, 27, 2520, 2929, 10262, 247, 1566, 323, 7046, 7173, 358, 23702, 1127, 4870, 970, 11454, 258, 3229, 690, 7681, 32771, 403, 5611, 281, 1581, 253, 17697, 7133, 281, 1818, 16196, 26374, 275, 2380, 281, 747, 3394, 21223, 253, 8820, 7133, 310, 11848, 2220, 326, 4081, 275, 2720, 789, 327, 11454, 256, 3229, 30628, 497, 3839, 2762, 670, 253, 9021, 285, 253, 16774, 20215, 285, 253, 4477, 1160, 6832, 11701, 1309, 253, 5955, 3408 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 310, 3839, 973, 3542, 50275, 4674, 575, 20, 310, 247, 1652, 21643, 347, 697, 417, 12450, 2590, 534, 629, 273, 253, 1566, 310, 970, 14221, 390, 752, 403, 253, 2685, 3602, 326, 3058, 281, 320, 5998, 575, 849, 50276, 9846, 253, 11454, 2990, 2605, 1007, 751, 849, 310, 352, 10166, 50276, 531, 1537, 4282, 2139, 247, 2685, 1754, 1566, 310, 908, 2190, 841, 1142, 3676, 1006, 800, 3210, 835, 1057, 436, 30332, 2322, 1361, 562, 50276, 783, 26970, 1566, 310, 417, 973, 50221, 285, 310, 247, 1652, 12744, 50276, 262, 651, 452, 644, 4722, 281, 923, 835, 253, 4116, 1566, 3798, 863, 1727, 281, 2057, 275, 247, 1524, 1533, 941, 873, 390, 275, 2969, 27350, 575, 85, 899, 8892, 50276, 29813, 1638, 285, 655, 403, 1892, 281, 956, 697, 1175, 281, 21184, 327, 731, 387, 1878, 275, 271, 30762, 2593, 1955, 281, 2317, 7364, 50276, 9846, 253, 2412, 7513, 10202, 4209, 575, 1542, 7103, 2139, 417, 625, 27350, 575, 40480, 751, 2362, 390, 673, 13650, 625, 8813, 285, 22861, 651, 452, 644, 9371, 5474, 33032, 2520, 789, 2340, 684, 247, 747, 966, 273, 4764, 5904, 323, 7046, 7173, 358, 23702, 1127, 4870, 534, 4648, 11454, 258, 615, 281, 8046, 12112, 1029, 71, 21718, 3210, 273, 13358, 3394, 326, 403, 15783, 275, 5415, 673, 285, 2317, 50275, 296, 3755, 20556, 436, 789, 310, 9093, 271, 6880, 273, 253, 11454, 6923, 256, 3229, 480, 571, 50276, 67, 24480, 6247, 835, 253, 11935, 8062, 310, 23115, 347, 271, 258, 615, 285, 253, 8820, 31697, 310, 23115, 347, 247, 2892, 7976, 305, 12064, 7802, 3268, 275, 436, 789, 253, 8820, 31697, 310, 2007, 6508, 281, 271, 258, 615, 1754, 8062, 323, 436, 4096, 1264, 1027, 5415, 2622, 3006, 2685, 3210, 403, 4081, 673, 39381, 272, 260, 35478, 6923, 260, 35478, 33056, 422, 260, 35478, 671, 247, 1781, 1180, 273, 4679, 403, 5196, 285, 1666, 25379, 403, 2429, 281, 17813, 253, 6452, 50276, 74, 5583, 18235, 387, 253, 1655, 3924, 323, 253, 4606, 2708, 50276, 20881, 1255, 247, 2201, 4468, 310, 604, 619, 4685, 310, 987, 1046, 1616, 1269, 74, 310, 23115, 347, 271, 258, 615, 273, 1269, 262, 327, 470, 16816, 275, 253, 275, 673, 39381, 272, 260, 35478, 285, 33056, 422, 260, 35478, 594, 627, 403, 295, 253, 1180, 273, 2792, 258, 3229, 275, 253, 1566, 436, 9978, 310, 20276, 984, 667, 2792, 3707, 253, 337, 296, 403, 7479, 281, 5108, 387, 673, 470, 594, 597, 37593, 4360, 7081, 247, 1616, 1269, 74, 387, 673, 470, 275, 958, 667, 673, 1078, 16816, 18, 310, 7479, 247, 625, 5272, 1039, 281, 17710, 253, 8062, 273, 1269, 74, 310, 281, 1566, 253, 258, 615, 327, 16816, 18, 16816, 534, 310, 908, 275, 253, 6923, 260, 35478, 891, 2096, 436, 9978, 17904, 281, 253, 7529, 13782, 342, 253, 294, 19484, 1320, 10480, 275, 958, 436, 310, 1921, 2139, 1097, 673, 39381, 272, 260, 35478, 285, 33056, 422, 260, 35478, 476, 320, 10302, 275, 7529, 533, 6923, 260, 35478, 2550, 253, 33056, 422, 260, 35478, 476, 320, 2326, 347, 247, 14923, 2715, 273, 673, 39381, 272, 260, 35478, 1955, 281, 253, 10199, 273, 2892, 10096, 533, 253, 6923, 260, 35478, 310, 247, 1027, 1566, 347, 4767, 1840, 50276, 12563, 253, 6923, 260, 35478, 476, 1566, 253, 21213, 1818, 273, 253, 8820, 31697, 533, 253, 673, 39381, 272, 260, 35478, 285, 33056, 422, 260, 35478, 2550, 28055, 8288, 253, 6923, 260, 35478, 943, 452, 247, 625, 6422, 13532, 14603, 7384, 643, 4243, 403, 1072, 2429, 342, 1110, 767, 3210, 2139, 1057, 253, 33056, 422, 260, 35478, 1566, 5115, 247, 1805, 390, 2810, 3045, 685, 6923, 260, 35478, 275, 954, 4679, 1057, 326, 
1599, 253, 8062, 275, 954, 15302, 452, 642, 16196, 10533, 5046, 247, 2969, 13506, 3368, 342, 16196, 10533, 275, 8062, 476, 1361, 5276, 436, 50275, 8826, 2173, 7350, 690, 13506, 941, 4679, 342, 2173, 9978, 24088, 16196, 10533, 403, 3058, 281, 1918, 247, 3676, 4685, 273, 253, 767, 4081, 8820, 260, 35478, 3210, 50276, 555, 5367, 258, 287, 71, 253, 1273, 1386, 432, 253, 5004, 275, 253, 806, 3239, 50276, 520, 253, 1273, 1386, 273, 16186, 746, 5474, 339, 431, 248, 2929, 29328, 247, 11454, 853, 3169, 1127, 1232, 323, 7046, 7173, 358, 23702, 941, 762, 253, 2087, 7792, 1264, 1798, 11640, 403, 4081, 597, 6016, 941, 342, 1027, 5319, 285, 452, 1027, 15180, 6733, 50275, 856, 84, 50275, 783, 2934, 310, 4722, 314, 4460, 50275, 783, 4081, 1566, 35615, 403, 512, 9257, 1869, 949, 1016, 1566, 4445, 310, 973, 24013, 8550, 1146, 4516, 407, 21414, 22861, 50275, 783, 9759, 310, 2590, 690, 7681, 4243, 403, 30357, 281, 1239, 50275, 783, 16774, 1543, 327, 2412, 7513, 10202, 5301, 1007, 18511, 50275, 5040, 50275, 783, 4679, 403, 8489, 5075, 436, 310, 253, 2022, 1921, 891, 42126, 1918, 247, 2169, 4868, 50275, 1542, 11935, 5301, 627, 310, 2649, 667, 11454, 8245, 1566, 50275, 9088, 310, 2649, 667, 10554, 7200, 5301, 50275, 358, 5378, 474, 1783, 310, 1077, 3710, 5046, 260, 7958, 273, 3710, 4679, 5196, 50275, 34974, 50275, 783, 897, 273, 50276, 261, 1663, 1327, 15291, 275, 9990, 285, 5145, 4715, 50276, 27978, 12853, 8489, 3216, 33024, 476, 4477, 1158, 273, 1529, 14951, 281, 35991, 288, 390, 5046, 288, 476, 320, 4934, 1580, 253, 2014, 11631, 5981, 310, 37039, 2217, 281, 3167, 1048, 7424, 50275, 15214, 25, 2139, 417, 2412, 268, 89, 50276, 85, 310, 436, 247, 1745, 80, 50275, 83, 9866, 3782, 298, 296, 78, 342, 44351, 26202, 553, 8763, 1375, 369, 4081, 407, 479, 74, 285, 299, 261, 1216, 4240, 4321, 685, 253, 11106, 7692, 45388, 1162, 355, 6247, 50276, 3062, 1189, 253, 14168, 3607, 2529, 407, 16186, 79, 6903, 19, 671, 2186, 323, 479, 74, 50276, 70, 261, 1216, 4240, 50276, 5092, 4477, 20420, 14409, 841, 10291, 50275, 7152, 33032, 2520, 2929, 29328, 247, 4460, 3676, 2746, 281, 253, 4715, 273, 7046, 7173, 358, 23702, 1127, 4870, 3066, 2622, 3006, 14221, 4583, 891, 1158, 436, 310, 247, 1175, 2929, 15250, 1142, 4722, 5697, 285, 778, 3486, 2007, 2561, 327, 1127, 4870, 50276, 783, 5019, 273, 2685, 3169, 2990, 2605, 285, 253, 37851, 1566, 3659, 1232, 943, 1056, 3282, 253, 15895, 285, 9759, 403, 1175, 534, 2789, 253, 2929, 3477, 281, 956, 2299, 627, 403, 247, 4564, 273, 3533, 323, 253, 4477, 281, 2007, 2953, 337, 352, 3133, 326, 253, 4081, 1566, 4428, 247, 6923, 260, 35478, 323, 253, 1616, 5912, 268, 5664, 633, 85, 516, 417, 2119, 604, 352, 1663, 2789, 3282, 326, 253, 5912, 273, 1616, 556, 247, 6923, 689, 673, 1060, 824, 15895, 3133, 281, 320, 20276, 352, 310, 15246, 323, 253, 3216, 7133, 1159, 281, 1908, 253, 16196, 10533, 387, 253, 1127, 672, 271, 2362, 6634, 347, 352, 6125, 247, 11329, 453, 89, 2996, 40365, 1483, 11935, 1127, 1232, 253, 3386, 403, 2223, 8025, 281, 320, 2860, 11426, 689, 673, 352, 651, 320, 1805, 604, 253, 4477, 476, 15249, 824, 247, 15895, 374, 352, 3133, 540, 44374, 323, 253, 1566, 281, 3283, 253, 1735, 2362, 281, 11897, 253, 13024, 673, 285, 1616, 323, 253, 1735, 2362, 581, 943, 1908, 28676, 342, 1675, 281, 29331, 285, 268, 534, 4453, 3240, 2570, 672, 597, 6388, 258, 3229, 50276, 20, 253, 4477, 403, 15140, 281, 2007, 17093, 253, 33056, 422, 260, 35478, 436, 3133, 281, 320, 247, 1077, 4722, 9400, 533, 253, 4477, 760, 1918, 247, 4864, 10199, 275, 1077, 2159, 2600, 352, 310, 417, 1077, 2590, 849, 253, 4477, 19071, 
253, 4116, 5122, 281, 260, 35478, 50276, 21, 253, 3368, 4453, 10380, 5075, 50276, 7053, 253, 4477, 45688, 326, 49745, 556, 247, 1781, 15579, 11041, 2299, 253, 11041, 273, 49745, 7024, 327, 253, 10295, 16992, 352, 310, 417, 4344, 281, 5963, 1754, 327, 816, 581, 638, 20188, 10295, 16992, 1273, 253, 4477, 1646, 281, 2985, 247, 4564, 273, 1666, 25379, 326, 2968, 342, 253, 1072, 4836, 275, 253, 3368, 253, 295, 28368, 285, 253, 391, 6917, 377, 403, 671, 2104, 281, 1566, 253, 8820, 46258, 1127, 1232, 604, 253, 11655, 403, 1818, 281, 253, 17082, 327, 299, 26365, 2317, 16280, 4496, 1908, 50276, 965, 298, 50276, 91, 3227, 288, 4059, 4715, 36833, 3210, 323, 2675, 12741, 2351, 275, 23964, 37613, 48011, 8583, 4870, 39951, 2284, 4059, 632, 246, 50276, 413, 340, 9169, 13660, 264, 466, 21733, 8583, 4870, 29375, 253, 16958, 273, 34894, 39951, 2284, 9169, 187, 187, 4118, 18435, 27, 2520, 2929, 10262, 247, 1566, 323, 7046, 7173, 358, 23702, 1127, 4870, 970, 11454, 258, 3229, 690, 7681, 32771, 403, 5611, 281, 1581, 253, 17697, 7133, 281, 1818, 16196, 26374, 275, 2380, 281, 747, 3394, 21223, 253, 8820, 7133, 310, 11848, 2220, 326, 4081, 275, 2720, 789, 327, 11454, 256, 3229, 30628, 497, 3839, 2762, 670, 253, 9021, 285, 253, 16774, 20215, 285, 253, 4477, 1160, 6832, 11701, 1309, 253, 5955, 3408 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

Summary: this paper addresses the ARC dataset by reformulating the question using embeddings from ConceptNet. Their model selects a few terms from the question using the embeddings from ConceptNet, rewrites the query based on the selected terms, retrieves the documents, and solves the query. The empirical result shows that embeddings from ConceptNet are beneficial, and the overall result is comparable to recent performance on the ARC dataset.

Quality. Pros: 1. This paper contains a thorough study of recent QA models and datasets. 2. This paper describes the model architecture, conducts ablation studies of different essential-terms classification, and includes thorough comparisons with recent models on the ARC Challenge. Cons: although the paper includes recent works on QA models/datasets, it doesn't contain much study of query reformulations; for example, "Ask the Right Questions: Active Question Reformulation with Reinforcement Learning" (Buck et al., ICLR 2018) is one of the related works that the paper didn't cite. The paper does not have any example of reformulated queries or error analysis.

Clarity. Pros: 1. The paper describes the framework and model architecture carefully. Cons: 1. It is hard to understand how exactly they reformulate the query based on selected terms; I think examples would help. For example, in Fig. 2, after "activities", "used", "conserve", and "water" were selected, how does the rewriter write the query? Examples would help. 2. Similar to the above, it would be helpful to see examples of the decision rules in Section 5.2. 3. It is hard to understand how exactly each component of the model was trained. First of all, is the rewrite module only trained on the essential-terms dataset, as mentioned in Section 3.1.3, and never fine-tuned on the ARC dataset? Same question for the entailment modules: are they only trained on SciTail and not fine-tuned on the ARC dataset? How were the decision rules trained? Are all the modules trained separately without being trained jointly? Which modules were trained on the ARC dataset? All of this is a bit confusing, since there are many components and many datasets were used.

Originality and significance. Pros: query reformulation methods have been used on several QA tasks (like Buck et al., 2018, above), and incorporating background knowledge has been used before too, as described in the paper, but I think it is fairly original to do both at the same time. Cons: it is a bit disappointing that the only part using background knowledge is selecting essential terms using the ConceptNet embedding; I think "using background knowledge" is too general a term for this specific idea. In general I think the paper has enough contribution to be accepted if some descriptions are better clarified.

docsep

This paper focuses on the recently introduced ARC Challenge dataset, which contains 2590 multiple-choice questions authored for grade-school science exams. The paper presents a system that reformulates a given question into queries that are used to retrieve supporting text from a large corpus of science-related text. The rewriter is able to incorporate background knowledge from ConceptNet, and a textual entailment system trained on SciTail identifies support in the retrieved results. Experiments show that the proposed system is able to outperform several baselines on ARC. Sec. 2.2: "Paraphrase-Driven Learning for Open Question Answering" (ACL 2013) and "Learning to Paraphrase for Question Answering" (EMNLP 2017) can be added to the related work section. Sec. 3.1: the seq2seq model predicts 0 and 1 to indicate whether a word is salient; a more straightforward method is using a pointer network for the decoder, which directly selects words from the input. This method should be more effective than the seq2seq used in Sec. 3.1.1. Sec. 3.1: how about the performance of removing the top CRF layer? The LSTM layer and the classifier should play the most important role. How to better utilize external resources is an interesting topic and is potentially helpful to improve the results of answering science exam questions; for example, the entailment module described in Sec. 5.1 can be trained on other, larger data, which in turn helps the problem with smaller data. I would like to see more details about this. Are the improvements significant compared to the baseline methods? A significance test is necessary because the dataset is quite small. Experiments on large-scale datasets are encouraged.

docsep

This paper introduces an end-to-end system to answer science exam questions for the ARC Challenge. The system is a combination of several existing techniques, including (i) query rewriting based on seq2seq or NCRF, (ii) an answer retriever, (iii) an entailment model based on match-LSTM, and (iv) knowledge graph embeddings. The description of the system is clear and there is an abundant ablation study. However, I have the following concerns about this paper. 1. There seem to be no new techniques proposed in the system, hence the novelty of this work is questioned. 2. I do not understand why the authors use TransH, which is a KG embedding model that differentiates one entity into different relation-specific representations. 3. The system is significantly outperformed by Sun et al. (2018) and Ni et al. (2018). ### Summary:
This paper presents a method for finding important tokens in the question and then using prior knowledge from ConceptNet to answer questions in the ARC dataset. Pros: combining essential-term selection using domain knowledge with textual entailment. Cons: none of the proposed methods are novel; the paper combines a few components to answer questions in ARC. The intro and abstract are written a bit more generally than what the paper actually does: the paper studies different ways to incorporate ConceptNet. The results between dev and test are not consistent; some methods achieve better results on the dev set, but the authors use a model which is not the best on dev to report on the test set.
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 8774, 50276, 2520, 2929, 12453, 253, 12423, 10895, 407, 8460, 8287, 253, 1953, 970, 46234, 432, 4473, 3024, 616, 1566, 34899, 247, 1643, 2426, 432, 253, 1953, 970, 253, 46234, 432, 4473, 3024, 294, 8510, 265, 253, 7316, 1754, 327, 253, 4236, 2426, 12802, 265, 253, 7177, 285, 35910, 253, 7316, 253, 16774, 906, 2722, 326, 46234, 432, 4473, 3024, 310, 12912, 285, 253, 4583, 906, 310, 10870, 281, 3332, 3045, 327, 12423, 10895, 50276, 15177, 5847, 337, 436, 2929, 4428, 247, 11080, 1263, 273, 3332, 2805, 66, 3210, 285, 15302, 374, 436, 2929, 8631, 253, 1566, 10336, 2589, 84, 28913, 2175, 273, 1027, 5667, 2426, 9162, 285, 3797, 11080, 14023, 342, 3332, 3210, 327, 12423, 7881, 50276, 5040, 50276, 20261, 253, 2929, 3797, 3332, 2987, 327, 2805, 66, 3210, 46906, 1507, 352, 36908, 3831, 1199, 2175, 327, 7316, 8460, 3339, 323, 1650, 50276, 1945, 253, 987, 3533, 3939, 1953, 8460, 1427, 342, 35221, 4715, 12433, 1162, 355, 17857, 32888, 4765, 310, 581, 273, 253, 2905, 2987, 326, 253, 2929, 42126, 26542, 50276, 783, 2929, 1057, 417, 452, 667, 1650, 273, 8460, 2907, 19241, 390, 2228, 1783, 50276, 498, 15752, 50276, 856, 84, 337, 253, 2929, 8631, 253, 7792, 285, 1566, 10336, 9257, 50276, 5040, 337, 352, 310, 1892, 281, 2096, 849, 4555, 597, 8460, 4187, 253, 7316, 1754, 327, 4236, 2426, 891, 1158, 6667, 651, 1361, 323, 1650, 275, 3036, 374, 846, 4712, 908, 772, 4003, 285, 1824, 497, 4236, 849, 1057, 294, 16360, 3630, 253, 7316, 253, 6667, 588, 1361, 374, 2074, 281, 253, 1840, 352, 651, 320, 9371, 281, 923, 253, 6667, 273, 3061, 4803, 275, 2593, 8073, 495, 352, 310, 1892, 281, 2096, 849, 4555, 1016, 4445, 273, 253, 1566, 369, 10166, 806, 273, 512, 310, 24813, 6333, 760, 10166, 327, 5667, 2426, 10895, 347, 5393, 275, 2593, 31389, 285, 1620, 1442, 292, 37437, 327, 12423, 10895, 1072, 1953, 323, 46518, 420, 11911, 310, 352, 760, 10166, 327, 660, 262, 647, 417, 1442, 292, 37437, 327, 12423, 10895, 849, 858, 3061, 4803, 10166, 403, 512, 253, 11911, 10166, 11794, 285, 419, 2254, 644, 10166, 26277, 752, 11911, 497, 10166, 327, 12423, 10895, 512, 273, 841, 403, 247, 2372, 21643, 1580, 627, 250, 1142, 4295, 285, 1142, 15302, 497, 908, 50276, 19164, 414, 50276, 9188, 40348, 50276, 856, 84, 50276, 7267, 8460, 1427, 3082, 452, 644, 908, 327, 2067, 2805, 66, 8892, 751, 12433, 1162, 355, 4765, 1840, 285, 24049, 4114, 3640, 556, 644, 908, 1078, 1512, 347, 2529, 275, 253, 2929, 533, 891, 1158, 697, 9648, 3236, 281, 513, 1097, 275, 253, 1072, 673, 50276, 5040, 50276, 262, 310, 247, 2372, 31623, 326, 253, 760, 629, 970, 4114, 3640, 310, 17221, 5667, 2426, 970, 4473, 3024, 21496, 891, 1158, 253, 1307, 970, 4114, 3640, 310, 1512, 2087, 1307, 323, 436, 2173, 2934, 50276, 249, 2087, 891, 1158, 253, 2929, 556, 2217, 7680, 281, 320, 7607, 604, 690, 20121, 403, 1805, 31637, 7152, 33032, 2520, 2929, 16633, 327, 253, 4102, 5611, 12423, 5691, 10895, 534, 4428, 2030, 2270, 2709, 4327, 3533, 47895, 323, 23129, 1651, 5859, 34666, 253, 2929, 10262, 247, 985, 326, 8460, 17815, 247, 1677, 1953, 715, 19241, 326, 403, 908, 281, 19553, 8109, 2505, 432, 247, 1781, 20689, 273, 5859, 4919, 2505, 253, 294, 16360, 310, 2104, 281, 19071, 4114, 3640, 432, 4473, 3024, 247, 45860, 46518, 420, 985, 10166, 327, 660, 262, 647, 326, 22649, 1329, 275, 253, 22111, 1543, 50276, 16217, 3825, 921, 326, 253, 4081, 985, 310, 2104, 281, 562, 32231, 2067, 1666, 25379, 327, 12423, 50275, 1704, 3307, 50276, 29404, 6622, 1061, 
(The remainder of this row's input_ids array, its attention_mask array, and its labels array are omitted here; they are tokenized copies of the row's Input and Output text.)
Below is a given review of a research paper from a conference journal. Please write a summary of the review.

### Review:

This paper presents a very interesting idea, if I understand it correctly, which is to implement conditioning on, e.g., a speaker embedding vector via a parametrization of the layer-to-layer activation functions of an overall deep neural net architecture. The motivation is to achieve the same effect as the well-known concatenation approach, but without requiring as many new parameters. I found the formalism hard to follow, impeding my grasp of the fundamental proposal; however, eventually I got the idea. The results are promising, showing good preservation of, e.g., WER quality in the unconditioned scenario, with good WER in the specialized scenario and with fewer parameters than the concatenation method. I think the name "learned activations" somewhat undersells the idea; wouldn't it be better named "learned activation functions"? That would immediately clarify the fundamental concept and highlight its originality.

See my comments above in the summary for strengths and weaknesses. In a nutshell, I think the paper suffers from a lack of clarity that undersells what is otherwise a nice idea. I suggest introducing the specifics of the core of the idea earlier on in the paper; there is a sense of anticipation that is a bit frustrating for me as a reader. Some more specific comments follow.

Re the sentence ending in "and a family of basic activations {a_i : R -> R}, i = 1, ..., A, of which a particular realization could be the set of the most commonly used activations in deep learning, e.g. ReLU, sigmoid, tanh, and so on": check the grammar; I believe this is not a complete sentence given what comes before the subclause I just quoted. What is softmax_rowwise?

Re the first three equations in Section 3.1: what is the relationship between la_elementwise,c(z_j) and la(h_j), z_j? Numbering the equations would be helpful. What is c? I did not quite follow the formalism; I am unable to clearly relate the equations in 3.1 to Figure 1c.

A number of references are given for the concatenation approach, which is good, but no references are given for the modulation approach AFAICT, though this is referred to as a state-of-the-art approach. Here again I am unable to follow the discussion linking the formalism to Figure 1b. I am guessing that the formalism is correct, but the presentation could be substantially simplified and clarified.

Figure 2: "To take into account that basic activations may have different ranges, the plots show LAs minus the average of the basic activations, i.e. la_c(z_j) minus that average, for c in [-3, 3] and values of j corresponding to the selected users." I cannot relate la_c(z_j) to the label of the x-axis, which is h.

Figure 4: "To take into account that the non-personalized basic activations are ReLU or Swish, the plots show LAs minus the average of those two activations, i.e. la_c(z_j) - (ReLU(z) + Swish(z))/2, for c in [-3, 3] and values of j corresponding to the selected users." I suggest labeling the x-axis; would that be z? How should we relate z_j to the z (without the j subscript) that is the argument of ReLU and Swish?

Section 4.1, Table 1: define SDRi before using the term. Section 4.2: HPO (hyperparameter optimization). Section 4.4: STFT: define the term. Section 5.4: greedy search: provide a reference.

Later, in the results section, the authors state "furthermore, the gains can be seen to be more pronounced for greedy than beam search CTC decoders, reaching up to 40% relative improvement for some users", which is a significant improvement in the domain of ASR. But greedy search doesn't represent the typical final ASR benchmark, so the greater relative gain than for beam search doesn't seem too significant; however, I am not sure what the authors mean by greedy search, so they should clarify this.

Good originality AFAICT and good results; somewhat unclear formalism and presentation of the main concepts.

docsep

This paper proposes a way of conditioning information on neural networks. In the literature, a common way to condition a neural network with an input would be to either concatenate the conditioning vector to the input vector, or inject it before several layers (the modulation approach in Figure 1b). In this paper they instead propose to pass the conditioning vector through a weighted sum of the outputs of neural-net activation functions. The claim is that this way of training the neural network leads to a reduction in neural network parameters without sacrificing performance on speech enhancement / ASR tasks. The experimental results indicate that the proposed method of training the RNN/CNN-based systems for speech enhancement and ASR leads to a reduction in the number of parameters. The authors argue that this is an important advantage for low-resource computing.

I have several questions on this. I have reviewed this paper before, and I see that some of my concerns from earlier, even though they were partially addressed in the rebuttal, now seem not to be addressed in the current version of the manuscript.

The number of parameters to assess the computational advantages of a model is questionable in my opinion. I think a better measure for this would be the number of FLOPs in a forward pass, or the memory usage with respect to input sequence length. Namely, this table was presented in the rebuttal:

    model     LA    FLOPs (M)   latency (ms)
    RNN       y     125485   528543   36392
    RNN       n     180260   537766   11403
    TDS       y     150743   116451   06992
    TDS       n     290646   123580   03440
    TDSRNN    y     440660   392326   15582
    TDSRNN    n     454392   394809   20247

I think it would be good to include these results. Overall, we see that the improvement is not always clear, especially in terms of latency, when the variance is taken into consideration.

For the new experiment with ASR and pASR (part 2 of the experiments), the presentation is not super clear to me, and I am not sure whether an improvement over alternative conditioning techniques is shown here.

I was a reviewer for this paper for NeurIPS 2021. Even though I voted for acceptance that time, I now see that some of my concerns, and concerns pointed out by other reviewers, are not fully addressed. For instance, I do not see a comparison with this paper: https://arxiv.org/pdf/1601.02828.pdf. I also do not see a comparison in terms of FLOPs and latency, for which the improvement seems marginal from the results added to the NeurIPS rebuttal. I therefore vote for marginal rejection.

docsep

This paper introduces a method called learned activations for personalized speech enhancement and personalized automatic speech recognition. The learned activations are obtained by a weighted sum of nonlinear activations of hidden layer outputs, and the weights are computed from a softmax of the projected speaker embeddings, such as x-vectors. The main benefit of the model, as compared to concatenation- or modulation-based approaches, is that learned activations result in a smaller number of learnable parameters and hence a smaller model size. Experimental results show that, in terms of SDR improvement, the proposed method achieves performance comparable to the baseline models with a smaller model size. In the case of ASR, the proposed approach provides WER reduction as compared to an unadapted model.

Strengths:
- Extensive experiments on two application areas: speech enhancement and automatic speech recognition. The paper also tried to provide quantitative results by plotting learned activations for different speakers.
- The proposed model is comparably smaller than the concatenation-based approaches; the method might be suitable for on-device applications.

Weaknesses:
- As there are many experimental results, the paper may occasionally become harder to read towards the end. Some details are not well described; for example, in Tables 1 and 2, what does "cond" mean?
- Even though the model size is smaller, it comes with additional on-the-fly computation of several nonlinearities of hidden activations, which may cause other concerns for on-device applications.
- The FiLM approach should be briefly mentioned in the text before reporting the results with this method.
- The HPO acronym should be defined (page 6).
- How exactly does the study perform finetuning of the speaker embedding model? Details are missing (page 8).
- For pASR, does the study finetune the speaker embedding model jointly with the ASR model, or are they learned separately?

The paper provides a personalization method by conditioning nonlinear activations based on the speaker embeddings. There are extensive experiments. One concern is that the paper may need some additional clarification of details for better reproducibility.

### Summary:
The authors propose a novel method for conditioning deep neural networks: they replace the activation function with a linear combination of activation functions (e.g. ReLU), where the weights for the activation functions are dynamically computed from the input during inference and training. The approach is evaluated on standard public tasks and shows improvement over well-established alternatives.

Pros:
- A simple, novel method for conditioning that is widely applicable.
- Adequate empirical evaluations to demonstrate its effectiveness.

Cons:
- No major weakness.

The reviewers provided several pieces of feedback; the authors incorporated the suggestions and clarified residual concerns. The revised version of the paper has improved the readability and utility substantially.
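The mechanism summarized above (and described in the third review) is concrete enough to sketch. The snippet below is a hypothetical illustration, not the authors' code: the particular set of basic activations, the embedding size, the tensor shapes, and the use of PyTorch are all assumptions made only for the example.

```python
# Hypothetical sketch of a "learned activation" layer: the nonlinearity applied to a
# hidden layer is a softmax-weighted sum of basic activations, with the weights
# computed from a conditioning embedding (e.g., a speaker x-vector).
import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnedActivation(nn.Module):
    def __init__(self, embed_dim: int):
        super().__init__()
        # One possible realization of the basic activations mentioned in the reviews.
        self.basic = [torch.relu, torch.sigmoid, torch.tanh, F.silu]
        # Small projection from the conditioning embedding to one logit per activation.
        self.proj = nn.Linear(embed_dim, len(self.basic))

    def forward(self, z: torch.Tensor, cond: torch.Tensor) -> torch.Tensor:
        # z: pre-activations of shape (batch, features); cond: (batch, embed_dim).
        weights = torch.softmax(self.proj(cond), dim=-1)            # (batch, n_basic)
        stacked = torch.stack([a(z) for a in self.basic], dim=-1)   # (batch, features, n_basic)
        return (stacked * weights.unsqueeze(1)).sum(dim=-1)         # same shape as z

# Usage: condition one hidden layer on a 512-dimensional speaker embedding.
layer = LearnedActivation(embed_dim=512)
z = torch.randn(8, 256)          # hidden pre-activations
speaker = torch.randn(8, 512)    # stand-in for an x-vector
out = layer(z, speaker)          # (8, 256); the embedding is never concatenated to z
```

In this sketch the only new parameters are those of the small projection layer, which is why the reviews describe the approach as cheaper than concatenating the conditioning vector into every layer input.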
(The input_ids, attention_mask, and labels arrays for this row are omitted here; they are tokenized copies of the Input and Output text above.)
Below is a given review of a research paper from a conference journal. Please write a summary of the review.

### Review:

The authors notice that popular SSL evaluation protocols are often constrained to computer vision tasks and are generally time-consuming and environmentally unfriendly. Based on this, they construct USB by selecting several tasks on which they evaluate dominant semi-supervised learning (SSL) methods. To reduce the cost of SSL, they also provide a pretraining and finetuning paradigm. The used tasks are well divided according to their domains, and the algorithms are categorized.

This paper is well motivated: as the authors point out, previous work typically trains deep neural networks from scratch, which is time-consuming and environmentally unfriendly. I applaud the authors' efforts in the experiments to clearly and comprehensively clarify the strengths and weaknesses of different algorithms in different domains; they conduct quite enormous experiments. The authors also use nearly every dataset commonly used in the different tasks. This paper has many contributions from both technical and conceptual perspectives. I applaud the authors' efforts to clearly categorize every algorithm according to its own features.

As the authors imply, they still lack some other SSL tasks, such as imbalanced semi-supervised learning, open-set semi-supervised learning, etc. Although the authors use 14 different algorithms to conduct experiments, there are still some algorithms remaining. The tables used to store experiment data seem too small, like Tables 5, 6 and 7.

docsep

This work proposes a new benchmark for semi-supervised learning, USB, which covers evaluation on 15 different tasks in 3 different domains (vision, NLP and audio). The benchmark is constructed to not be too computationally expensive to run. Additionally, a number of different SSL algorithms are compared on the benchmark, and there is some analysis of which methods succeed.

The benchmark includes a large and diverse evaluation suite: 15 different tasks in 3 different domains, including the audio domain, for which USB is the first to evaluate SSL. 14 different SSL algorithms are implemented and compared on the benchmarks, save for those algorithms which do not make sense to implement in certain modalities. The benchmark is constructed with computational expense in mind, such that it requires much fewer GPU-hours to evaluate a method across the suite of tasks compared to previous SSL benchmarks. Using pretrained backbones is an important axis of variation missing in previous benchmarks, and USB investigates the impact of the use of pretrained backbones, finding that most SSL algorithms become more efficient. USB compares the rankings of different algorithms across the different domains; I believe this analysis/result is the first in which SSL algorithms are compared between domains.

My main criticism of this work is that there is little justification given for a couple of core experimental choices in the USB benchmark. This is a considerable missed opportunity, since USB is positioned as a compute-friendly SSL benchmark that academics or other researchers can hill-climb on. As such, this work could do a better job of convincing the reader that results from this benchmark will transfer to other settings, wherever possible by performing additional experimental validation. Let me be concrete.

The choice of pretraining backbone: I do agree that it is reasonable to leverage pretrained backbones as a way to improve the performance of SSL; however, this is a significant departure from the standard SSL setting and deserves some justification. Why are the specific pretrained ViT, BERT or wav2vec models defined in USB chosen? Is there a reason they are better than alternatives? Do the rankings, or the conclusions about which SSL algorithm is best, change if the pretrained backbone is changed for a different architecture or one trained on a different dataset? If the choice of pretraining backbone has a large influence on the SSL algorithm ranking, this should be discussed in more detail; if not, then the experiments need to be presented to inspire confidence in the current choice of backbone.

Why is ImageNet excluded from USB? This seems like an odd choice, especially since most SSL work has been developed for ImageNet, and we know ImageNet is a reasonable dataset to hill-climb on, since many previous works have shown that ImageNet gains routinely transfer to other datasets. There is some discussion giving computational cost as the reason, but its exclusion is odd given that USB positions itself as unified. One way around this would be to conduct an evaluation on ImageNet and compare the rankings to the CV portion of USB; if the rankings are consistent, this would be evidence that ImageNet results closely track the rest of USB-CV and are thus perhaps redundant/not needed, but this needs to be run/shown.

Another weakness of this work is the choice of overall metric. Having a rank as the overall metric seems to be a very poor choice, since the score for any method depends on the rest of the methods implemented in the benchmark at that point in time. If two new methods A and B were proposed at the same time separately, there would be no way to compare which of A and B is better by looking at their reported ranks until they were both implemented into the USB codebase and the ranks were recalculated. This seems like a major downside. What about just using average accuracy or something similar? It is a much more standard reporting metric and allows time-independent comparisons.

docsep

The authors propose USB, a unified SSL benchmark to facilitate general SSL research. USB offers benchmarks across five CV datasets, five NLP datasets and five audio datasets. By using pretrained vision transformers, the authors found that they can largely decrease the training time when evaluating the performance of an SSL algorithm. The authors implement 14 SSL algorithms and will open-source the codebase.

The problem is interesting and important: the SSL problem has been widely studied in the machine learning community and is a fundamental problem in CV and NLP. The authors implement 14 well-known SSL algorithms and put them into an open-sourced codebase for easy comparison. Comprehensive experiments clearly demonstrate the differences. The authors provide analysis from the experiments, such as the difference between training from scratch and using pretraining in SSL, and the effectiveness of SSL when using state-of-the-art neural models as the backbones.

The authors should be clearer about the motivation and why this work / open-sourced codebase is useful for the community. For example, in the introduction the authors mention that existing benchmarks are mostly constrained to CV tasks, but they do not clearly illustrate why this is a problem and what advantages can be gained by building a unified SSL benchmark.

There is a lack of insights or explanations for the experimental results. For example, in Section 4.3, when answering "why should we evaluate an SSL algorithm on diverse tasks across domains", the authors say that "the differences between ranks of SSL algorithms in different domains cannot be ignored; hence it is crucial to introduce diverse tasks from multiple domains when evaluating an SSL algorithm". Why can it not be ignored? After introducing diverse tasks, what insights can we get?

docsep

This paper introduces USB, a benchmark for semi-supervised learning which covers classification problems in several domains, including vision (CIFAR-100, STL-10, EuroSAT, TissueMNIST, Semi-Aves), language (IMDB, AG News, Amazon Review, Yahoo Answer and Yelp) and audio (ESC-50, GTZAN, UrbanSound8k, FSDnoisy18k). This benchmark aims to cover more domains than the standard approaches for benchmarking, as well as to reduce the amount of computation required (37 GPU days vs. 335 GPU days for FixMatch). The paper also introduces code for 14 existing SSL algorithms, which are evaluated according to the benchmark. Notably, the paper shows that under the USB benchmark FlexMatch gains 5 places in ranking over the previous CRMatch approach in computer vision. The paper's major contributions are introducing a collection of tasks for benchmarking SSL and demonstrating, with some analysis, the performance of some SSL algorithms on this task.

In general, the paper is well motivated: no existing benchmark for semi-/self-supervised learning algorithms covers such a wide range of domains and approaches. The paper does a fairly good job of selecting efficient yet relevant classification tasks within the three target domains of language, vision and audio; given performance on the USB benchmark, I would be relatively confident in the performance of the semi-supervised learning algorithm. In addition, the paper has relatively strong analysis of the tested algorithms. I am much more confident in the performance and experiments given here than in many papers, and from the code provided it is clear that the results are reproducible and extensible. The framework for evaluation is also reasonable, and none of the methods for summarizing the performance over the tasks seem too opinionated. I appreciate specifically that the paper includes multiple settings and seeds as part of the evaluation time and benchmarking process, which is often lacking in existing implementations. I further really appreciate the fact that the amount of training time is taken into account.

Generally, I'm worried about the scope of the paper and the exact choices made for the benchmark. The USB benchmark, as stated, claims to be a unified benchmark for semi-supervised learning; however, the chosen tasks largely ignore many of the tasks that individual communities consider challenging or difficult. More specifically, I'm concerned that the benchmark is limited to classification tasks, which are traditionally global in nature and require little fine-grained detail to solve. Such a decision may lead to research prioritizing models which are designed for classification tasks and ignore other equally important and interesting tasks. In vision, many applications of semi-/self-supervised learning focus on fine-grained and zero-shot classification, which are mostly missing from the benchmark (Semi-Aves is the only included example which is even close); additionally, semi-/self-supervised learning has been shown to be applicable to tasks in object detection, grounding, 3D vision and image generation, which are all widely applicable and interesting to explore. In NLP, the chosen datasets are more widely explored but ignore important tasks such as language generation / general language modeling, summarization and open-domain QA; additionally, datasets like IMDB are largely considered solved in the supervised domain, which can make it hard to use as a differentiating task even in SSL. In audio, the benchmarks focus on audio classification, but many other tasks are relevant, including prosody classification, speaker recognition, automated speech recognition and others (for a good summary of tasks, see the SUPERB benchmark, [46] in the paper). In particular, I would like to see a wider range of tasks in a unified benchmark which more closely reflect the current SOTA applications of semi- and self-supervised learning.

Another concern I have is with the chosen algorithms. As discussed in the limitations section of the paper, there are a significant number of SSL-based algorithms which are missing. Further, the chosen algorithms to be implemented are largely from a single direction ("match") in the SSL space, which could bias future research towards algorithms structured in significant ways. I would really like to see a wider range of field-representative algorithms, rather than a detailed exploration of single-algorithm variants, included in the benchmark. Finally, there is no related work section in the paper, which makes it difficult to place in context.

Some other minor comments: while Table 4 lists a few interesting properties of each of the algorithms, there is no analysis of the impact of these properties on the USB benchmark scores; it would be interesting and relevant to explore how such properties affect the results. There is little discussion as to why each of the datasets was chosen, beyond the fact that they are in use in the field. I would really like to see some figures which demonstrate the correlation between the standard benchmarks and the proposed USB benchmark; it is a bit hard to parse from the table whether the results are correlated and, if they are, how closely they are related. I would appreciate some more discussion of the numerical correlations between the standard approach and the proposed approach, and I additionally wonder if a subset of the chosen experiments could be used to reduce training time and would correlate as well or better with the full chosen experiment set; is it possible to demonstrate that every benchmark chosen is necessary? The analysis of the t-SNE in Figure 3 is somewhat superficial and doesn't really add much to the paper; what makes for a "better" t-SNE embedding is underspecified at best and does not always indicate separability (see here: https://distill.pub/2016/misread-tsne). The backbones that are chosen for the approach are somewhat arbitrary; it would be good to include a discussion of possible backbones and to understand why each backbone was chosen. Performance on some of the benchmarks was a bit confusing, as it was presented as error rate instead of accuracy, which is standard for many of the datasets shown.

docsep

This paper introduces a comprehensive benchmarking for semi-supervised learning on vision, NLP and audio, and provides an environmentally friendly and low-cost evaluation protocol with a pretraining-finetuning paradigm, reducing the cost of SSL experiments. Additionally, this paper implements 14 SSL algorithms and open-sources a modular codebase and config files for easy reproduction of the reported results.

1. This paper builds a comprehensive benchmarking for semi-supervised learning which covers vision, NLP and audio.
2. The work opens the source code for further extension, which is beneficial for the ML community in the SSL field.
3. This paper is well written and provides details on the datasets and experiment setup.
4. It is great that "we plan to evolve the benchmark in the future iterations over time by extending with more tasks" (lines 220-221); this benchmarking could be impactful for the SSL community.

1. Although the experiment results are comprehensive, the insights and analysis are relatively limited. It would be better if the authors could provide more insights and discoveries from the experiment results; these insights are critical for inspiring SSL model designs in the future.
2. It is suggested that what the benchmark can do for the SSL community could be described more clearly and explicitly in the paper.
3. Since there are already a lot of existing SSL benchmarks, the relation to and comparison with existing benchmarking may need more detailed descriptions, which could be put in the appendix.

docsep

Existing SSL methods' evaluation protocols are limited because of (i) only implementing them using CV tasks and (ii) the requirement to train deep neural models from scratch. This paper evaluates different SSL algorithms across different domains (CV, NLP and audio tasks) using vision transformers (ViT) instead of training conventional structures such as ResNet from scratch. The main contribution is providing the pretrained models that serve as a benchmark for SSL algorithms; these evaluation models are based on ViT.

1. The paper in general is well motivated, especially as it proposes to facilitate the development of SSL algorithms for research labs with limited computational resources.
2. The variety of the datasets across multiple domains and the 14 considered SSL algorithms.

1. In general, further clarifications are needed for some parts of the paper (points 2 and 3 below).
2. For the sentence "compared with training neural models from scratch, pretraining has much reduced cost in SSL, yet relatively few benchmarks offer a fair test bed for SSL with the pretrained versions of neural models": can the authors provide references? If there are other test beds for SSL methods, it is recommended to mention them and point out the limitations of theirs that USB addresses in this paper. I understand that the authors show the difference in GPU days needed to evaluate SSL algorithms in Table 1a, referencing [21]; however, further explanation and detail on the weaknesses of [21] would improve the paper.
3. No structural details or layer stacks are provided for the utilized ViT. Given that the main contribution of this work is the use of a pretrained ViT as an evaluation benchmark, I believe describing how the ViT is utilized, along with its structure, is of high importance. Also, it is not clear, for instance, whether the same architecture is used for all tasks: it is mentioned that for CV the authors use a model similar to [27], while for NLP and audio it is mentioned that a transformer model similar to [64-66] is adopted. Does this mean that a different pretrained model is used for each task?
4. While the limitations section is appreciated, it is not clear why none of the methods mentioned in this section is implemented; a justification is needed in this case, which leads to the question: can USB be considered unified if popular GAN-based and/or GCN-based SSL methods are not implemented?

### Summary:
this article went through an active rebuttal period and definitely converged to a final form that should be published at this venue as highlighted by the latest updates of all reviewers it will be a good addition to the broad multidomain semisupervised community
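The pretrain-then-finetune protocol the reviews above keep returning to — finetune a pretrained vision transformer with a semi-supervised objective instead of training a ResNet from scratch — can be made concrete with a short sketch. The snippet below assumes a timm-style pretrained ViT and a FixMatch-style confidence-thresholded pseudo-labeling loss; the model name, the unlabeled loader yielding (weak, strong) augmented views, and the hyperparameters are illustrative assumptions, not the benchmark's actual code.

```python
import itertools
import torch
import torch.nn.functional as F
import timm

def finetune_ssl(labeled_loader, unlabeled_loader, num_classes,
                 steps=10_000, threshold=0.95, lr=5e-4, device="cuda"):
    # start from a pretrained ViT instead of training a ResNet from scratch
    model = timm.create_model("vit_small_patch16_224", pretrained=True,
                              num_classes=num_classes).to(device)
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    labeled_iter = itertools.cycle(labeled_loader)
    unlabeled_iter = itertools.cycle(unlabeled_loader)  # assumed to yield (weak, strong) views
    for _ in range(steps):
        x_l, y_l = next(labeled_iter)
        x_weak, x_strong = next(unlabeled_iter)
        x_l, y_l = x_l.to(device), y_l.to(device)
        x_weak, x_strong = x_weak.to(device), x_strong.to(device)

        sup_loss = F.cross_entropy(model(x_l), y_l)

        with torch.no_grad():                      # pseudo-labels from the weak view
            probs = model(x_weak).softmax(dim=-1)
            conf, pseudo = probs.max(dim=-1)
            mask = (conf >= threshold).float()     # keep only confident predictions
        unsup_loss = (F.cross_entropy(model(x_strong), pseudo,
                                      reduction="none") * mask).mean()

        loss = sup_loss + unsup_loss
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model
```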
[input_ids: token-id array encoding the example above — long integer list omitted for readability]
[attention_mask: all ones, same length as input_ids — omitted]
[labels: token-id array matching input_ids — omitted]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper shows that the count median sketch and the count min sketch for frequency estimation can be made differentially private by adding a small amount of gaussian noise the sketches are then used for quantile estimation similarly to how the nonprivate sketches are used for the same problem for better or worse this is a very straightforward application of textbook differential privacy techniques the small dimension of the sketch leads to a good bound on the ell2 sensitivity which allows adding a small amount of gaussian noise to achieve privacy this is a useful tool to have but in my opinion a straightforward application of the gaussian noise mechanism to a wellknown sketch is not enough to meet the bar for publication the paper needs either a tighter analysis note that a recent preprint by pagh and thorup claims a tighter bound on the error of the count sketch with added gaussian noise or a more impressive application no comment docsepthis paper demonstrates that linear sketches can be made differentially private by adding a small amount of gaussian noise at initialization experimental results on zipf and the real caida dataset show that the utility is less influenced by this modification strengths trendy topic wellorganized paper the improvement is simple yet effective weaknesses the assumption limits the application scenarios lack of analysis detailed weakness comments this paper points out that linear sketches can be made differentially private and provide useful information while maintaining their original properties specifically contrary to the standard operation that initializes each counter with 0 the authors initialize each counter with a noise sampled from certain gaussian distribution whose parameters are determined by the accuracy parameter failure probability and budget for zcdp i like this idea overall though this modification is small the authors prove that this could make the linear sketches differentially private however this paper also has some flaws such as the assumption and lacking analysis of the case where the adversary is actively involved i list my concrete concerns in the following w1 the assumption limits the application scenarios in section 22 the authors briefly introduce linear sketches they assume that item x has value v equals either 1 or 1 which limits the application scenarios of linear sketches however from algorithms 1 and 2 i didnt see any operation that relies on this assumption therefore i would like to know whether this is a redundant assumption or if this assumption is required to make sure the private linear sketches satisfy differential privacy if this assumption is only needed to achieve the dp guarantee i think the tradeoff between privacy and applicability should be discussed w2 it would be nice if the authors could provide a brief illustration about why vanilla linear sketch does not hold differential privacy property which may make the improvement intuition clearer in this paper w3 lack of analysis the key idea to making existing linear sketches private is to initialize counters with random noise as shown in algorithm 3 and the authors provide a series of analysis to prove that the modified scheme satisfies the dp definition which is nice however i wonder whether these properties still hold when the private linear sketches are attacked for example the authors initialize counter c with e plus noise sampled from a gaussian 
distribution for countmin case yet the attacker could eliminate this e in the initialization step since e is determined once the accuracy parameter failure probability and budget for zcdp are fixed i would like to know would the scheme still satisfy the dp requirement the authors adequately addressed the limitations and potential negative societal impact of their work docsepthe paper present differentially private variants of linear sketches such as count min and count median sketches the differential privacy guarantees are provided using a recent notion of differential privacy zdp dwork and rothblum 2016 bun and steinke 2016 the only change from vanilla sketches is the initialization provided in algorithm 3 which when combined with the update algorithm 1 and query algorithm 2 components of count min and count median sketches leads to the differentially private versions of these sketches the initialization itself adds gaussian noise to each array item any other noise source satisfying zcdp would also suffice the analysis of the algorithms directly follows by combining the analysis for vanilla sketches with the recent differentially privacy results building on the differentially private count median sketch the authors propose a dyadic countmedian sketch that can estimate all the quantiles simultaneously while ensuring differential privacy the experimental evaluations validate the practicality of the algorithms overall while the algorithms presented as well as their analysis are not novel the paper does a good job of reviewing the literature in two disjoint fields and providing a comprehensive set of differential private linear sketches which need to be documented therefore i recommend acceptance 1 the authors systematically study the literature on linear sketches and differential privacy and provide a comprehensive set of differentially private linear sketches 2 the paper is well written and easy to follow 3 however there isnt any novelty either algorithmically or in the theoretical analysis of the algorithms the only real change from an algorithm standpoint is the initialization phase of the linear sketches the proofs follow by bootstrapping the standard proofs for linear sketches and using the recent differentially privacy results 4 this would be a solid paper had the authors presented some sort of impossibility result for frequency estimation under privacy guarantees yes docsepthe paper considers differential privacy for some standard sketches determining their differential privacy bounds under gaussian noise strength natural problem from differential privacy analysis experiments good set of experimental results weaknesses analysis is straightforward doesnt get same onesided error bounds just a with high probability variant originality im concerned about the originality in some sense similar results with perhaps weaker or incomparable results its hard to tell appear in other papers such as mir d muthukrishnan s nikolov a and wright r n panprivate algorithms via statistics on sketches in proceedings of symposium on principles of database systems pods pp 3748 2011 which is cited in this paper and melis l danezis g and cristofaro e d efficient private statistics with succinct sketches in annual network and distributed system security symposium ndss the internet society 2016 which is not the results seem to follow just by applying the standard gaussian mechanism and then applying composition theorems from other work lemmas 2729 perhaps nobody has written this down before but i have trouble 
believing it was not known as it seems a straightforward composition quality given the originality concerns above the quality is ok clarity the writing is generally good and clear significance see the originality issues above na ### Summary:
this work constructs differentially private linear sketches for frequency estimation and related problems it shows that count minmedian sketch can be made dp by adding noise this is an interesting contribution the reviewers had concerns about the novelty of the paper the updated version of the paper does a reasonable analysis of the algorithm the authors point out in the rebuttal how their bounds are incomparable to a concurrent work i am not convinced that the incomparable parts are interesting in the sense that closeness to the nonprivate count sketch does not seem like a useful goal in itself and the target of interest is always the original frequencies the authors do an empirical analysis and show that the algorithm works well under reasonable privacy budgets overall i think this paper may be a reasonable one to accept
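The mechanism these reviews describe — keep the usual count-min update and query, but initialize every counter with Gaussian noise calibrated to the zCDP budget instead of zero — is simple enough to spell out. In the snippet below the width, depth, bucket hashing, and the exact noise calibration (L2 sensitivity of sqrt(depth) for ±1 updates, sigma = sqrt(depth / (2·rho)) from the standard Gaussian mechanism for rho-zCDP) are assumptions made for the example, not the paper's stated constants.

```python
import math
import random
import numpy as np

class PrivateCountMin:
    def __init__(self, width=2048, depth=5, rho=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.width, self.depth = width, depth
        sigma = math.sqrt(depth / (2.0 * rho))   # Gaussian-mechanism scale for rho-zCDP
        # counters start from Gaussian noise rather than zero
        self.table = rng.normal(0.0, sigma, size=(depth, width))
        self.salts = [random.Random(seed + r).random() for r in range(depth)]

    def _bucket(self, row, item):
        return hash((self.salts[row], item)) % self.width

    def update(self, item, value=1):
        # each item touches exactly one counter per row
        for r in range(self.depth):
            self.table[r, self._bucket(r, item)] += value

    def estimate(self, item):
        # count-min takes the minimum over rows; a count-median variant takes the median
        return min(self.table[r, self._bucket(r, item)] for r in range(self.depth))
```

Because the sketch is a linear map of the data, adding the noise once at initialization is equivalent to adding it to the released counters, which is why no further noise is needed at query time.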
[input_ids: token-id array encoding the example above — long integer list omitted for readability]
[attention_mask: all ones, same length as input_ids — omitted]
[labels: token-id array matching input_ids — omitted]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this work deals with the imagelevel annotation problem in fewshot classification during training time given support and query images the authors first split images into patches use a transformer to extract feature tokens and learn which tokens are more related to the label during test time it self-learns within support images to find which tokens are more important the experiment shows a consistent improvement strength the transformeronly architecture makes it very clean and thus makes it potentially a new baseline for the problem reweighting features to focus on taskspecific information is also a reasonable idea the experiment shows a consistent improvement weakness it is better to discuss other feature reweightingbased methods in fewshot classification for example for finegrained fewshot classification lee et al a shows a consistent improvement in all previous methods by adding an attentionbased feature reweighting module a lee et al task discrepancy maximization for finegrained fewshot classification cvpr 2022 yes the author addressed the potential problem when training data is highly limited docsepmotivated by supervision collapse caused by standard fewshot training on weak imagelevel labels authors introduce a tokenbased approach based on unsupervised vision transformers that reweights tokens in an inner loop based on their discriminative power the model achieves strong results and demonstrates the viability of vision transformer models on fewshot tasks without extra pretraining strengths demonstrated use of vision transformers for fewshot learning is on its own a neat contribution demonstrated use of purely unsupervised pretraining for fewshot learning is also a neat contribution if not quite as novel results are impressive and span multiple benchmarks and architectures straightforward and sensible approach to token aggregation and reweighting weaknesses the strong results come with a major caveat the vitsmall and swintiny architectures have 22m and 29m parameters respectively while the compared baselines are almost entirely based on resnet12 which by my recollection has only 12m parameters while this is still less than the widelyused wrn2810 backbone 36.5m params i worry that the comparisons presented in the paper are apples-to-oranges the difference in model size should be discussed and addressed use of vision transformers for few shot classification deserves an empirical study all on its own understandably this is not provided here but because of this it is unclear to what degree improvement is coming from the token reweighting scheme in theory compatible with existing convolutional architectures vs the vision transformer backbone in theory compatible with existing fewshot classifiers for example how does the token reweighting scheme compare to simply training a linear classifier head on the support features from a vision transformer admittedly a full comparison along both these axes would be clearly out of scope here similarly there is no ablation study provided for the impact of token reweighting vs the logsumexp aggregation scheme how much better is logsumexp aggregation than direct addition for example which would correspond to basic prototype comparison with reweighted averages on each prototype more broadly it appears that the token reweighting scheme is broadly compatible with many existing tokentotoken classifiers such as ctx and frn and it is not clear how the logsumexp aggregator compares
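To make the aggregation question concrete, here is a minimal sketch of the two pooling choices the reviewer is contrasting, written as PyTorch pseudocode; the tensor shapes, the temperature, and the function name are illustrative assumptions, not the paper's actual implementation.

```python
import torch

def class_score(sim, weights, mode="logsumexp", tau=1.0):
    # sim:     (P_q, P_s) similarities between query patch tokens and the
    #          patch tokens pooled from one support class
    # weights: (P_s,) learned importance weights for the support tokens
    weighted = sim * weights  # down-weight background / uninformative tokens
    if mode == "logsumexp":
        # soft-max-style pooling: the score is dominated by the best-matching
        # support tokens for each query token
        return tau * torch.logsumexp(weighted / tau, dim=-1).mean()
    # direct addition / averaging: every support token contributes equally,
    # which is roughly prototype comparison with re-weighted averages
    return weighted.mean()
```

Under this reading, the ablation asked for above amounts to toggling mode between the two branches while keeping the learned weights fixed.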
more generally the approach while straightforward and sensible does contain a few design choices that are not fully explained or empirically justified for example in addition to the above the choice of token similarity metric a slightly relevant omitted citation the masked inner token reweighting scheme might possibly owe some conceptual debt to batch folding from few shot learning with localization in realistic settings cvpr2019 which also models a supporttosupport classification task with an identical imagemasked leaveoneout scheme though admittedly implemented quite differently the analysis of 1shot effectiveness and discussion of smaller training datasets is insightful the entanglement of vision transformer benefits with token reweighting benefits in presented results is not discussed societal impacts are not discussed though do not extend beyond those of fewshot learning in general docsepthe paper addresses the problem of fewshot classification the main idea is to establish semantic correspondences between the patches from the support and the query images these correspondences are then used to reason which class a query image belongs to in order to downweight the impact of background patches when performing classification the authors also introduce an online optimization strategy to determine which patches in the support images are most informative when performing fewshot classification the method uses the vision transformer to encode the patches in the support and query images in order to learn strong generic features the vision transformer is trained in an unsupervised manner using the masked image modelling task the selfsupervised pretraining is shown to provide better results than the supervised counterpart the proposed method obtains stateoftheart results on four fewshot classification benchmarks strengths s1 the paper is well written and easy to read s2 the proposed fewshot classifier using patchwise correspondences is novel and interesting the online optimization allows determining which regions are most crucial to perform classification and can be helpful especially in case of clutter in the support set images the use of patchwise correspondences allows determining the class of the query image by jointly reasoning over the support set as well as the query s3 the selfsupervised pretraining of the vision transformer makes sense especially in the context of fewshot learning to learn generic feature representation s4 the proposed approach is shown to obtain stateoftheart results on 4 standard benchmarks mini imagenet tiered imagenet cifarfs fc100 s5 the authors provide helpful analysis and ablation studies showing the impact of the major contributions weaknesses i do not have any major issues with the paper some minor issues which could be addressed w1 as shown in fig 4 the use of selfsupervised pretraining for vision transformer provides a significant improvement compared to supervised training it would be interesting to see what performance existing classifiers eg learning a linear classifier prototype obtain when using the same backbone network this would help evaluate the benefits of the proposed embedding similarity based classifier w2 in the sota comparison most previous methods use resnet12 backbone while the authors employ vitsmall and swintiny it will be helpful if the authors include the model sizes number of parameters for each of these backbones for comparison w3 the approach computes patchwise correspondence between all the support and query images i wonder if this could become computationally expensive when dealing with a large number of classes or when each class has more samples eg 100 way 30 shot classification a discussion on this would be beneficial
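As a rough back-of-envelope for w3 (the episode sizes below are illustrative assumptions, not numbers from the paper): with S support images, Q query images and P patch tokens per image, computing all patch-to-patch similarities takes on the order of S * Q * P^2 dot products, e.g.

S * Q * P^2 = 3000 * 1500 * 196^2 ≈ 1.7e11

for a 100-way 30-shot episode with 15 queries per class and a 14x14 token grid, which is why the scaling question deserves an explicit discussion even though each individual dot product is cheap.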
the authors discuss the limitations of their work docsepthis paper adopts selfsupervised trained vision transformer vit architecture as the feature extractor deriving patchlevel representations for fewshot classification problems to exploit the relation across patches the authors propose a token importance reweighting mechanism which is required to perform during both training and testing stages the experimental results show satisfactory performance in several commonlyused fsl benchmarks to verify the effectiveness of this method the overall paper is easy to follow the idea of utilizing the selfsupervised trained vit architecture to derive patchlevel representations is interesting and seems effective for fewshot learning tasks since selfsupervised model pretraining is agnostic to the imagelevel class labels the trained model is more generalizable for downstream tasks also applying vit to extract patchlevel features allows the model to produce more finegrained information however i have the following concerns about this work 1 in fig 4 the authors show that selfsupervised pretraining performs significantly better than the supervisedpretraining counterpart however in prior ssl literature eg 16 a ssl pretraining only slightly outperforms supervised pretraining sometimes or is even worse than it a proper explanation or insight into why the proposed ssl pretraining surpasses the supervised counterpart by the large margin reported in fig 4 is needed 2 the tsne visualization in fig 5 only verifies that the patchlevel embeddings derived from the same instance are clustered together and those from different instances are separated from each other however this figure only explains/visualizes separation between different instances but not the discrimination between different classes which is much more important for fsl it is desirable to see whether the embeddings extracted from the same class are gathered while the embeddings of different classes would separate far from each other 3 since the title of this paper emphasizes the aspect of generalization in fewshot learning one would expect learning strategies with results/comparisons with recent crossdomain fewshot learning works eg b in other words crossdomain fsl aims to transfer the learned knowledge to the novel classes in unseen target domains showing generalization ability 4 although this paper applies masked image modeling mim as the pretext task for pretraining vit use of other selfsupervised pretraining approaches like contrastive learning eg dino c moco v3 a would be possible it would also be good if the authors provide some insights or comparisons about the choice of the selfsupervised pretraining approach if mim is desirable for this task more explanations and support would be needed 5 additional learning for test data fewshot instances from novel classes is needed for the proposed work but not necessarily for a number of sotas id like to see how the authors would elaborate on this issue 16 he et al masked autoencoders are scalable vision learners cvpr 2022 a chen et al an empirical study of training selfsupervised vision transformers iccv 2021 b chen et al a closer look at fewshot classification iclr 2019 c caron et al emerging properties in selfsupervised vision transformers iccv 2021 the authors did provide
discussions on the limitations of the proposed work ### Summary:
this paper tackles fewshot learning with a transformer architecture and inspired by the intuition that finegrained information is ignored in existing methods uses an innerloop token reweighting method to improve results overall the reviewers appreciated the use of modern architectures vision transformers the reasonableness of the reweighting intuition and the experimental results concerns were raised about comparison to existing methods with similar intuitions eg a mentioned by ef5w fairness of the comparison with respect to model capacity and in general ablations demonstrating that it is the method not transformers by themselves leading to improved results and lack of principled explanations for the design choices and computational complexity the authors provided strong rebuttals including new experiments using linear classifiers and prototypical approaches use of smaller models and a demonstration of potential pruning methods to address computational complexity the reviewers were overall receptive to the rebuttal and all recommended acceptance of this paper after some back-and-forth the paper provides both a nice benchmark applying vision transformers to fewshot learning as well as a method that is demonstrably better through ablation studies therefore this paper provides several nice contributions to the community and i recommend acceptance
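The linear-classifier and prototypical baselines raised in w1 above and in the second review, and which the meta-review says were added during the rebuttal, are not spelled out anywhere in the text; the following is only a generic sketch of what such a frozen-backbone baseline usually looks like, with hypothetical function names and the assumption that the backbone returns one pooled embedding per image.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def episode_features(backbone, images):
    # frozen self-supervised ViT; assume it returns an (N, D) embedding per
    # image, e.g. the [CLS] token or mean-pooled patch tokens
    return F.normalize(backbone(images), dim=-1)

def prototype_logits(backbone, support_x, support_y, query_x, n_way):
    s = episode_features(backbone, support_x)   # (N_support, D)
    q = episode_features(backbone, query_x)     # (N_query, D)
    protos = torch.stack([s[support_y == c].mean(dim=0) for c in range(n_way)])
    protos = F.normalize(protos, dim=-1)
    return q @ protos.t()                       # cosine logits, (N_query, n_way)
```

A linear-probe variant would instead fit a logistic-regression head on the normalized support features; either one gives a reference point that separates the contribution of the token-reweighting scheme from that of the vision transformer backbone itself.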
input_ids: [token id array omitted (tokenized form of the Input text above)]
attention_mask: [all-ones mask array omitted]
labels: [token id array omitted (appears identical to input_ids)]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper tackles the problem of imitation learning from observations only no access to expert actions in the setting where expert and agent are in different mdps more precisely transition dynamics are different the proposed approach consists of instantiating two agents an advisor and a learner such that the advisor is trained for its next state distribution to be close to the one of the expert while the learner is trained so its stateaction occupancy matches the one of the advisor the reason they instantiate this framework is that otherwise next states may be unreachable by the agents hence the trained discriminator if directly optimizing a gail loss between agent and expert occupancies may perfectly classify and learning may then be challenging instead in their setup the agent learns through a proxy living in the same mdp the advisor contributions they introduce an algorithm aiming to mitigate the issue of agent and expert having dynamics mismatch they demonstrate improved robustness to such mismatch in continuous control environments strengths empirically they demonstrate their algorithm is more robust than baselines in the mismatch setting the approach is quite innovative the problem they tackle is very interesting as in realistic scenarios there will usually be such a dynamics mismatch weaknesses i found the paper slightly hard to parse ie i had a hard time understanding for instance why we could not directly leverage the advisor policy and why the learner policy trained so its occupancy would match the advisor's would be better suited than the advisor's it would help if this could be further clarified maybe with an intuitive visual example showing the expert the advisor and the learner policies i find the paper in this current form to be borderline due to the fact that it is hard to parse and to understand why the framework is instantiated in the way it is it is still an interesting contribution and the empirical results are promising upon such clarifications to be made along with potentially visual examples providing intuitions i would be considering raising my score to a weak accept docsepthis paper tackles imitation learning from observations under model mismatch this is a realistic consideration when tackling more complex or real environments where methods such as il and its variants may see dramatic improvements over more conventional rlbased approaches to the point of being the only feasible approach model mismatch makes existing ilo methods unstable in theory at the very least to ameliorate this the paper introduces an advisor policy which learns from the expert demonstrations but under the constraints implicitly imposed by the capacity of the learner or the learner's dynamics the advisor is then used to train the learner the method is empirically validated on some standard benchmarks and appears to perform reasonably well summary the setup for this paper is super interesting the domain is a realistic one there is some limited data collected by simply watching an expert interact with the environment but where the learner is different in some critical way the question is then how do we leverage the observations of the expert to expedite learning an agent if solved generally see 5 6 below this would be a brilliant addition to the literature sections two and four are very well written section one motivates the problem well but doesn't provide much detail on what
content will be provided in the paper section three is not well written does not clearly explain the method and is instead a bit of a drawnout narrative that is hard to follow the empirical validation presented is lacking finally there is only a very short supplement that only provides hyperparameter/architectural settings and no further insight the crux of my critique is that i believe the empirical validation to justify a method with this many moving parts is lacking and that the clumsy exposition of the methodology obfuscates why each component is necessary and how they work in conjunction to create a performant policy while i believe there is a good method and a good conference submission in this line of work it is not currently in that state major comments 1 my main complaint with this work is the lack of clarity in how the advisor actually works and what it provides the advisor is this magic box that encapsulates the expert policy and the constraints/limitations of the learner the advisor is defined in learner space as that is the only space we tractably have access to and so how is the policy learned by the learner different to that of the advisor why can't the learner just imitate the advisor exactly using for instance dagger if the learner can imitate the advisor exactly then what is the point of the advisor why not just learn the learner by whatever method you use to learn the advisor it is then also poorly explained how the learner is subsequently recovered from the expert demonstrations and the advisor policy eq 2 is pointed to but without much more detail as to the intuition that underlies this step 2 the method itself is also very complicated as far as i can tell there are at least four trained elements the red network a reward network an advisor policy and a learner policy that is a lot of stuff to specify train tune etc there is also no indication of how robust to hyperparameter/architecture settings the performance is ie if i happen to use a red network with too low a capacity how is performance affected and how easy is it to diagnose that that is where the problem is i understand that all things rl are inherently a bit magic-boxey and can be sensitive but even still this method seems very convoluted/involved and is not sufficiently empirically validated to convince me that this complexity is warranted 3 following on from 1 and 2 is the advisor and/or red even strictly required it seems and i may have gotten it wrong here that you are essentially pretraining some policies to match the expert as well as they can and then refining the policies in places where the expert can't be matched i can just about believe that red provides a tractable method for learning a policy that keeps as close to the expert's state evolution as possible the advisor policy then steers the learner towards those actions why not just weight the ilo loss by the prediction from red or just initialize the learner to match the state distribution where it can and then refine based on reward or something more akin to thor sun bagnell boots 2018 if the authors can comment on and clarify this 4 i will say upfront that this may be a little unfair as a critique so take this with a pinch of salt the absolute performance of the method and even the baselines is disappointing although i don't know this for sure i am relatively confident that something like regular sac applied directly to the environment would achieve better performance in fewer environment interactions therefore why are we even bothering to use the expert
demonstrations if regular rl in the learner could achieve competitive performance i was desperately hoping that this method would leverage the imperfect demonstrations to obtain a learner policy with dramatically fewer environment interactions even if the absolute/final performance was slightly lower than the best-possible performance as i mentioned above there is a ton of internal mechanics in this method and while i believe there is an algorithm that can do ilo in this mismatched domain i am as of yet unconvinced that the ailo method presented is the method for doing it both in terms of absolute performance and technical overhead 41 the results in this paper have not convinced me that this method successfully expediently and robustly extracts information from demonstrations from a slightly different environment to expedite learning a policy 42 i think the following baselines are missing sac/rl on the learner also to provide the true upper bound on performance of the learner dagger on the learner using the misspecified expert policy behavioural cloning bc on just state-actions from the expert initializing a learner to the bc learned policy and then performing rl to refine that policy some of these are not practically applicable methods in real life given the domain but are in my opinion required to elucidate the performance of the method bearing in mind that baselines are also there to help the reader understand the performance and limitations of the method as opposed to simply whether or not it works 5 i would also have liked to have seen the method applied to a smallerscale toy example you even motivate the problem with a simple gridworld example why wasn't the method applied to such a gridworld a simple example would help you pull apart the method and verify that all the components are working as intended and would also allow instructive and intuitive visuals graphs diagrams heatmaps etc to be generated to further help the reader understand the method and its components as opposed to simply posting a grid of graphs showing that it achieves halfway reasonable performance 6 i think there may also be a missed opportunity exploring at least in principle connecting il and simtoreal domains simtoreal seems to me to be a domain where this method with some tweaking could have some utility as that is entirely derived from mismatch between two domains the advisor seems in part to be a more flexible representation of an expert policy learned in sim that accommodates for the imperfections of the learner learned at test-time in real there may be nothing in this and it is just a suggestion but the setups are so similar that any comparison is conspicuous by its absence even just commenting and discounting minor comments a i would like to see a more specific explanation of the method earlier in the paper the advisor is loosely mooted early on but is only really introduced with any specificity at the bottom of page 3 b only proper nouns should be capitalized reinforcement learning reinforcement learning infinite horizon markov decision process infinite horizon markov decision process c figure 1 should be improved to improve its clarity i have no idea what the squiggly lines represent are these lines something we will learn the second part of that figure also offers no helpful content as it is too highlevel i would also move it up to be a banner figure at the top of page two this would help frame your approach more succinctly and earlier on i would also consider even more simply pseudocoding the algorithm and
placing it alongside the diagram on page two to drive home the method early d figure 2 could easily be compressed to save space if required e the algorithm block is totally unhelpful it should either be made more precise using full mathematical notation or made more intuitive using pure pseudocode and highlevel objects f releasing code for this is ultra important because i do not believe i could reproduce the results in this work from the prose and limited supplement alone this work is interesting and shows promise and if it were to work would be a useful addition to the literature however i am unconvinced by the method itself the method is quite involved with lots of moving parts furthermore the presentation of the core of the method the advisor is remarkably poor considering that it is the contribution and that the rest of the paper is so well written the empirical validation is also a little lacklustre both in terms of range and actual performance i also have more subjective issues with aspects of the paper see comments above that while it would be unfair to reject solely on those issues dissuade me from giving the authors the benefit of the doubt here therefore i recommend that this paper is not accepted for inclusion in this review cycle but note that with some reworking this paper could make a good submission to a good conference in the near future good luck docsepthe paper proposes a new algorithm for solving the lfo task where the dynamics are different in the expert environment and the learner environment the algorithm first trains an intermediate policy in the learner environment that mimics the expert's state trajectory and then the learner can just mimic this intermediate policy in the learner environment the authors further show in the experiment that the proposed method outperforms previous lfo algorithms in several locomotion tasks the idea of using red for estimating the joint density for consecutive states for imitation learning from observation is interesting and the overall structure of the paper is clear and easy to follow eq 3 seems confusing j(pi_a) is a function of pi_a but on the rhs of the equation it is taking a maximum over pi_a i am not sure what we are taking the maximum over since pi_a should be fixed in this situation as an input of j(pi_a) also the last equation does not hold if it is taking max it only holds if it is taking argmax i do understand what the authors are trying to show here but this may need some revision similarly for eq 4 also the usage of the intermediate policy still lacks some intuition if the advisor policy is able to recover the expert state trajectory then optimizing the expected return st the state distributions are the same should solve the problem if the advisor policy is not able to fully recover the expert state trajectory how would we expect the learner policy to recover the state distribution of the expert policy since the learner policy is matching the stateaction distribution of the advisor policy eq above eq 2 for example how is the learning path the squiggly line in fig 1 achieved by the proposed objectives the setting itself is not new learning from expert and environments with different dynamics has also been proposed in many papers with different names policy transfer policy adaptation domain transfer sim2real etc the main major issue is the experiment the current baselines are weak since they are just lfo methods but do not specifically deal with dynamics mismatch even in this situation the proposed method is outperformed or evenly matched by
docsep

The paper proposes a new algorithm for solving the LfO task where the dynamics are different in the expert environment and the learner environment. The algorithm first trains an intermediate policy in the learner environment that mimics the expert's state trajectory, and then the learner can just mimic this intermediate policy in the learner environment. The authors further show in the experiments that the proposed method outperforms previous LfO algorithms on several locomotion tasks.

The idea of using RED for estimating the joint density of consecutive states for imitation learning from observation is interesting, and the overall structure of the paper is clear and easy to follow.

Eq. 3 seems confusing: J(π_a) is a function of π_a, but on the RHS of the equation it is taking a maximum over π_a. I am not sure what it is taking the maximum over, since π_a should be fixed in this situation as an input of J(π_a). Also, the last equation does not hold if it is taking max; it only holds if it is taking argmax (a generic illustration of this distinction is included after the summary below). I do understand what the authors are trying to show here, but this may need some revision; similarly for Eq. 4.

Also, the usage of the intermediate policy still lacks some intuition. If the advisor policy is able to recover the expert state trajectory, then optimizing the expected return s.t. the state distributions are the same should solve the problem. If the advisor policy is not able to fully recover the expert state trajectory, how would we expect the learner policy to recover the state distribution of the expert policy, since the learner policy is matching the state-action distribution of the advisor policy (the equation above Eq. 2)? For example, how is the learning path (the squiggly line in Fig. 1) achieved by the proposed objectives?

The setting itself is not new: learning from experts and environments with different dynamics has also been proposed in many papers under different names (policy transfer, policy adaptation, domain transfer, sim2real, etc.).

The main major issue is the experiments. The current baselines are weak, since they are just LfO methods that do not specifically deal with dynamics mismatch, and even in this situation the proposed method is outperformed or evenly matched by the baselines. This is not the main issue, though; the main issue is the lack of comparison with baselines under the same setting as the proposed method, such as [1], and possibly works from the areas above (policy transfer, policy adaptation, domain transfer, sim2real) that work in the same setting. Only comparing with baselines that do not specifically deal with dynamics mismatch does not seem very fair, and thus the experiments are not thorough enough for such an empirical paper.

[1] Tanmay Gangwani and Jian Peng. State-only imitation with transition dynamics mismatch. arXiv preprint arXiv:2002.11879, 2020.

Some mathematical representations still need some improvement and the experiments are not convincing enough; I would recommend a weak reject for the paper.

### Summary:
The submitted paper considers the very interesting problem of imitation learning from observations under transition model disparity. The reviewers recommended 2x weak accept and 1x weak reject for the paper. The main concerns about the paper regarded clarity of the presentation, complicatedness of the proposed method, and experimental validation. During the discussion phase the authors addressed some of the comments and provided an update of the paper with additional details. While some of the reviewers' concerns still stand, I think the addressed problem is very relevant and the proposed method can, with clarifications and improvements of the presentation, be interesting to parts of the community; hence I am recommending acceptance of the paper. Nevertheless, I strongly urge the authors to carefully revise their paper and to take the reviewers' concerns carefully into account when preparing the camera-ready version of the paper.
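As a brief illustration of the max-versus-argmax distinction raised in the Eq. 3 comment of the last review above (the paper's actual Eq. 3 is not reproduced here, so this only shows the generic point):

```latex
% Generic illustration, not the reviewed paper's equation: the maximizing policy
% and the maximal value are different objects, so an identity written with argmax
% does not in general survive being rewritten with max.
\[
\pi^{*} \;=\; \arg\max_{\pi}\, J(\pi),
\qquad
J(\pi^{*}) \;=\; \max_{\pi}\, J(\pi),
\qquad \text{but in general} \quad
\pi^{*} \;\neq\; \max_{\pi}\, J(\pi).
\]
```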
input_ids / attention_mask / labels: [token-id encodings of the example above omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
This paper extends prior results, namely that VAEs are able to learn the principal components. The novelty is the extension to a new distribution, the multinomial logistic-normal distribution. This is achieved by using the isometric log-ratio (ILR) transform. While prior results were derived analytically, this paper provides only empirical evidence for the claim regarding the multinomial logistic-normal distribution.

Overall, I don't find that the provided experiments give convincing support for the claim of the paper, namely that VAEs are able to learn the principal components for the multinomial logistic-normal distribution. While the proposed approach yields better results than alternative approaches in Figure 1, it is not clear to me why the shown results indicate that the VAE was actually able to learn the principal components, based on my understanding of the used metrics. For instance, an axis alignment of 0 indicates perfect results and 1 is the worst; the proposed approach achieves 0.8 on 200 dimensions and 0.95 on 1000 dimensions. When a deeper model is used in Figure 1 the metrics go down, which is good, but they still stay above 0.5, i.e., far away from 0. Based on my understanding of the used metrics, I am not sure why this would provide empirical support for the claim in this paper that the principal components can be recovered.

In the definition of axis alignment in the appendix, it seems like the numerator is missing a square compared to the original definition in the referenced paper [14].

The authors also claim that Figure 2 shows that the learned embedding dimensions are orthogonal to each other. While I can believe this in the right plot, I am not convinced by the left plot, where a different scaling is used (as mentioned by the authors) and hence the diagonal does not seem so much larger than the off-diagonals to me.

The paper also proposes a batch correction as an additional improvement, but Table 1 shows that it actually makes results worse.

Apart from that, Table 1 would also benefit from adding some state-of-the-art baseline models, e.g., as discussed in the related works section; simply applying kNN to the raw counts seems like a very basic baseline.

Apart from that, the writing of the paper could be improved a lot. I found it very confusing to come across references to Figure 5 etc. and then not find them in the paper; eventually I realized that they were in a separate supplement. This could be made clearer in the paper. Moreover, there are also several typos in the paper (e.g., "thethe") and some grammatically incorrect sentences. Also, in the related works section the authors refer to work by Luca et al. without a citation; does it refer to Lucas, with an s?

4. Updates after the discussion period: while the initial paper provided empirical evidence for the claims of the paper, based on the reviews an appendix of about 9 pages was added in the revised version, and the focus now seems to be on providing theoretical support for the claims in the paper. Also, several experimental results were added to the main paper in response to the reviews. I feel that all these quite major changes indicate that this paper is not mature enough for publication at this point, so I maintain my current review.
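The review above hinges on how recovery of the principal components is measured. The paper's own axis-alignment metric is defined in its appendix (following a referenced paper), so it is not reproduced here; the following is only a generic numpy sketch, under that caveat, of how one might check whether learned decoder directions span the true principal subspace via principal angles.

```python
# Hypothetical subspace-recovery check: cosines of the principal angles between
# the span of learned decoder weights and the span of the true principal
# components. Not the axis-alignment metric from the reviewed paper.
import numpy as np

def principal_angle_cosines(W_learned, W_true):
    """Return cosines of principal angles between two D x k column spaces."""
    Q1, _ = np.linalg.qr(W_learned)    # orthonormal basis of the learned subspace
    Q2, _ = np.linalg.qr(W_true)       # orthonormal basis of the true subspace
    return np.linalg.svd(Q1.T @ Q2, compute_uv=False)   # all near 1.0 => same span

rng = np.random.default_rng(0)
D, k = 200, 5
W_true = rng.standard_normal((D, k))
W_learned = W_true @ rng.standard_normal((k, k)) + 0.01 * rng.standard_normal((D, k))
print(principal_angle_cosines(W_learned, W_true))        # close to 1.0
```

A score like this only tests that the correct subspace is found; an axis-alignment style metric is stricter, since it also asks whether individual latent dimensions line up with individual principal components.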
docsep

The authors demonstrate a VAE model and estimation framework with which the PPCA subspace is recovered for data with multinomial observations. The authors are specifically interested in the high-dimensional scenario and show that their analytical results outperform other VAE methods.

General comments: the paper offers a novel solution to the seemingly persistent problem of performing efficient inference in a non-conjugate latent variable model for count data. The authors claim that existing solutions are not scalable. The paper is clearly written, with a few minor typos. I have 2 complaints:

1. The benchmarking analysis reports only relative performance across other VAEs. There are no other methods even considered, and it would be nice to have some sense of what a good result looks like in an absolute sense, e.g., what is a good score on subspace distance for a 200- or 1000-input-dimension problem? What would you get if you didn't use the VAE approach?

2. Could the authors comment more on the scalability problems they are talking about? There appear to be a number of scalable PCA techniques available where the full covariance matrix need not be formed and decomposed in memory (a covariance-free sketch is given after the next review); would none of those approaches be adaptable to their problem?

Minor note: the authors mention a few papers where MCMC was proposed as a means to obtain a posterior over the latent variables, but these ventures were dismissed as suffering from the curse of dimensionality, or something to the effect of that argument. One outstanding oversight in this collection of papers is Linderman et al. (https://papers.nips.cc/paper/5660-dependent-multinomial-models-made-easy-stick-breaking-with-the-polya-gamma-augmentation), wherein the stick-breaking construction of the multinomial is coupled with Polya-gamma augmentation to offer a highly efficient sampling procedure for precisely the model the authors are proposing. Could the authors comment in the rebuttal on how this approach compares?

docsep

This paper is well written, presenting a great interdisciplinary work on covariance estimation. It relies on recent techniques, such as the connection between VAE and PPCA and the ILR transformation, and presents an augmented VAE to obtain MAP estimates on multinomially distributed data. Such a setting has direct application potential in many bioinformatics problems. The methodology is quite dense, but it reads fine to me, though not checked in every detail. The only chance it gets rejected is its relatively narrow audience group at ICLR.

1. The notation can be improved to fit community convention: for example, p(z, θ | x) is better read as p(z | x; θ), since θ are only parameters, not the target random variable.

2. Clarity can be improved in the part elaborating Algorithm 1: clarify conclusive statements (e.g., the justification of complexity) and put technical details aside. Readers may have confusions about how the ILR transformation is used here; maybe Section 4.2 can hint at the exact use case, and the part after Algorithm 1 can be better organized.

3. The experiments part should have a clearer intro on machine-learning-abstracted versions of the problems, which would better fit the general audience at ICLR.

4. Some minor typos, e.g., in Section 4.1, Equation 2 it should be "i_{d-1}" instead of "i_d".

Hope this paper gets attention and wide adoption in related biological applications.
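On the scalability question in the second review above, the kind of covariance-free PCA it alludes to can be sketched in a few lines of numpy; this is purely illustrative and makes no claim about what the reviewed paper should have used.

```python
# Minimal sketch of PCA that never forms the D x D covariance matrix: take the
# thin SVD of the centered n x D data matrix directly.
import numpy as np

def pca_without_covariance(X, k):
    """Top-k principal directions and explained variances of X (n x D)."""
    Xc = X - X.mean(axis=0, keepdims=True)
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)   # thin SVD of the data
    components = Vt[:k]                                  # k x D principal directions
    explained_var = (s[:k] ** 2) / (len(X) - 1)
    return components, explained_var

X = np.random.default_rng(1).standard_normal((500, 1000))
components, variances = pca_without_covariance(X, k=5)
print(components.shape, variances.shape)                 # (5, 1000) (5,)
```

Randomized or incremental variants push this further for data that does not fit in memory, which is presumably what the reviewer has in mind.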
docsep

Summary: the authors propose a framework to estimate high-dimensional covariance matrices using multinomial variational autoencoders, showing applications to biomedical data sets. In particular, they use the probabilistic PCA framework, extending it to multinomial-distributed data. They show that similar works on PPCA are limited to Gaussian data and not easily applicable to bag-of-words analysis; results on real-world biomedical data sets are promising.

Reasons for score: I fail to grasp the novelty with respect to the state of the art. The authors cite numerous similar works, such as mixture modelling and LDA. The authors mention that such techniques rely on Dirichlet distributions, but that is not always the case, such as in [1] and [2], where they show it can be approximated with a logistic normal. It seems such models can be used in the applications mentioned by the authors; however, since there is no comparison in the paper, I fail to understand the benefit of one with respect to the others.

Pros:
- ILR transform to deal with identifiability of the softmax.
- Analysis on synthetic and real data sets.

Cons:
- Technical details can be improved. For example, it is not straightforward to understand what the encoder/decoder parameters refer to: is it the type of activation functions at each layer, the dimension of the layers, the number of layers?
- There is no comparison with mixture models that infer covariance matrices, like [1] and [2]; it is difficult to assess what the benefits of one are with respect to the others.

Minors:
- "thethe", page 5.

References:
[1] Srivastava, Akash, and Charles Sutton. Autoencoding variational inference for topic models. arXiv preprint arXiv:1703.01488, 2017.
[2] Blei, David, and John Lafferty. Correlated topic models. Advances in Neural Information Processing Systems 18, 2006, 147.

### Summary:
The authors extend the probabilistic PCA framework to multinomial-distributed data. Scalable estimation of principal components in the model is achieved using a multinomial variational autoencoder in combination with an isometric log-ratio (ILR) transform. The reviewers did not agree on the degree of novelty of the paper's contribution to PC estimation. The presentation of the paper can be improved. The reviewers criticise that large changes were made to the paper during the rebuttal phase. Overall the paper is borderline and, due to the mentioned large changes, I recommend a rejection and re-review at a different venue.
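Since the isometric log-ratio transform comes up in every review above and in the summary, here is a minimal, self-contained numpy sketch of the standard textbook ILR construction (a generic definition, not code from the reviewed paper): the centered log-ratio of a composition followed by projection onto an orthonormal basis of the hyperplane orthogonal to the all-ones vector.

```python
# Standard isometric log-ratio (ilr) transform for compositional data.
# Generic textbook construction; illustrative only.
import numpy as np

def ilr_basis(d):
    """Orthonormal d x (d-1) basis of the subspace orthogonal to the ones vector."""
    V = np.zeros((d, d - 1))
    for j in range(1, d):
        V[:j, j - 1] = 1.0
        V[j, j - 1] = -float(j)
        V[:, j - 1] /= np.sqrt(j * (j + 1))   # normalize column to unit length
    return V

def ilr(p):
    """Map a strictly positive composition p (sums to 1) to an unconstrained vector."""
    p = np.asarray(p, dtype=float)
    clr = np.log(p) - np.log(p).mean()        # centered log-ratio
    return clr @ ilr_basis(p.shape[0])        # length d-1, Euclidean coordinates

print(ilr(np.array([0.5, 0.3, 0.15, 0.05])))  # a 3-dimensional real vector
```

Mapping multinomial/compositional parameters into this unconstrained space is what lets a Gaussian-latent VAE or PPCA-style model be applied, which appears to be how the reviews describe the paper's use of the transform.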
input_ids / attention_mask / labels: [token-id encodings of the example above omitted]
Below is a review of a research paper from a conference or journal. Please write a summary of the review. ### Review: summary: this paper studies the convergence of adaptive gradient methods under an interpolation assumption, showing, for example, that these methods can converge at an O(1/T) instead of an O(1/sqrt(T)) rate when perfect interpolation is satisfied. convergence behavior with line search and polyak step sizes is also analyzed. reasons for score: this paper provides good insights regarding how an interpolation assumption may help accelerate adaptive gradient methods. i do not find the technical results very solid, as some difficult-to-check properties are simply put as assumptions (see cons). pros: 1. the results provide insight into why adaptive gradient methods may converge faster when the interpolation assumption is satisfied.
2. line search and polyak step size methods help address the need for problem and algorithm parameters in standard theories; moreover, there are few papers discussing line search and polyak step size methods in the finite-sum setup. 3. the polyak step size is well motivated in the interpolation setting. cons: 1. the abstract claims that adagrad can achieve an O(1) regret in the online convex optimization framework; i do not see this result in the main text. 2. the paper reads hand-wavy regarding difficult-to-check assumptions. in particular, this paper assumes that the sequence of iterates is bounded in a set of radius d and that the eigenvalues of the preconditioning matrices are bounded. the main argument supporting these assumptions is simply that they are common in the existing literature. i feel the technical challenges in the analyses are alleviated a lot because of these common assumptions.

notice that without these conditions the convergence guarantees may not be meaningful, because the parameter d in all the theorems, and a_min and a_max in theorem 3 and theorem 4, can scale with the iteration counter (a schematic version of this concern is spelled out after this list).
3. the proposed line search methods seem to be computationally very expensive: they require computing the largest step size satisfying the desired inequality, instead of the largest among a sequence of exponentially decaying step sizes as in standard armijo line search methods (a backtracking variant of this kind is sketched after this list). is it possible to analyze the performance of the latter, more computationally favorable, scheme?
4. it is claimed that the step size chosen by the proposed conservative lipschitz line search method is bounded in [2(1 - c)/L_max, eta_{k-1}]. can it happen that eta_{k-1} <= 2(1 - c)/L_max? if yes, is the step size selection rule still well defined? there is also a similar claim for the stochastic armijo line search method.
5. i don't get the sentence that "a similar distinction between the convergence of constant step-size adam or amsgrad vs adagrad has also been recently discussed in the nonconvex setting (defossez et al 2020)" in section 4. what is the distinction?
6. minor comments: (1) typo in the first paragraph: "online batch reduction" should be "online-to-batch reduction"; (2) m_1 is not specified in (1); (3) the abbreviation sps is not defined when it first appears in the main text.
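on the boundedness concern in cons 2 above, the issue can be stated schematically: suppose the guarantee has the shape

$$
\mathbb{E}\,[\,f(\bar w_T)\,] - f^\star \;\le\; C\,\frac{D^2}{T},
\qquad D \;\ge\; \max_{t \le T}\,\lVert w_t - w^\star\rVert,
$$

with C collecting problem constants (this display is a schematic stand-in, not a bound copied from the paper). it only yields a rate if D stays bounded independently of T; if the iterates are not actually confined to a fixed ball and D must grow with T (say D on the order of sqrt(T)), the right-hand side no longer decays.

to make cons 3-4 concrete, below is a minimal sketch of the two step-size rules the reviews keep referring to: a standard backtracking (exponentially decaying) armijo search and the stochastic polyak step size under interpolation. the function names, constants, and the toy interpolated least-squares problem are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def armijo_backtracking(f_i, grad_i, w, eta_max=1.0, c=0.5, beta=0.7, max_tries=50):
    """Backtracking Armijo search on a sampled loss f_i.

    Tries exponentially decaying candidates eta_max * beta^k and returns the first
    one satisfying f_i(w - eta*g) <= f_i(w) - c * eta * ||g||^2 (cheaper than
    computing the largest feasible step exactly)."""
    g = grad_i(w)
    fw = f_i(w)
    eta = eta_max
    for _ in range(max_tries):
        if f_i(w - eta * g) <= fw - c * eta * (g @ g):
            return eta
        eta *= beta
    return eta

def stochastic_polyak_step(f_i, grad_i, w, c=0.5, f_i_star=0.0, eta_max=10.0):
    """Stochastic Polyak step size: under exact interpolation each f_i attains 0
    at the common minimizer, so f_i_star = 0 requires no problem constants."""
    g = grad_i(w)
    return min((f_i(w) - f_i_star) / (c * (g @ g) + 1e-12), eta_max)

# toy interpolated problem: a consistent linear system, so every f_i is 0 at w_star
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 5))
w_star = rng.normal(size=5)
b = A @ w_star

def make_fi(i):
    a_i, b_i = A[i], b[i]
    return (lambda w: 0.5 * (a_i @ w - b_i) ** 2,
            lambda w: (a_i @ w - b_i) * a_i)

w = np.zeros(5)
for t in range(200):
    f_i, grad_i = make_fi(rng.integers(len(b)))
    eta = stochastic_polyak_step(f_i, grad_i, w)   # or armijo_backtracking(...)
    w -= eta * grad_i(w)
print(np.linalg.norm(w - w_star))                  # shrinks toward 0
```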
after reading the author rebuttal: i think the value of this work is to demonstrate the possible benefits of an interpolation assumption; hence, though there are some theoretical issues and a theory-practice gap, i keep the original score. i do not feel it is very reasonable to emphasize the regret result in the abstract and then put the corresponding section in the appendix; it is more reasonable to move appendix c2 to the main text, though i guess it would be difficult in practice. i understand that adding a projection step typically does not change the proof much; however, this is not the setup analyzed in this paper. i feel the associated newly added clarification in section 2 reads somewhat hand-wavy. i suggest the authors add some clarifications in the revision similar to the following from the rebuttal: "without an explicit projection step, we believe it is possible to adopt the recent sgd analysis on the almost sure convergence of stochastic gradient descent in nonconvex problems (neurips 2020) to prove that the iterates do remain bounded with high probability; we leave this for future work." the other clarifications are ok to me. i suggest the authors add some clarification regarding point 4 in the main text. docsepthis paper studies adaptive gradient methods in the overparametrized setting, where the authors study convergence in the interpolation setting; here the optimal objective value is 0. the authors show that the convergence rate is O(1/T); in addition, when interpolation is only approximately satisfied, the authors show convergence to a neighborhood of the solution. the authors also provide theoretical justifications for popular line search methods. overall i find the paper easy to read; however, i do have a few questions that i would like to see the authors answer. 1. the authors implicitly assume that the optimal solution is unique; however, this is not the case in many overparametrized models. for example, consider logistic regression where the two classes are perfectly separable: the minimizer is not well defined, but there has been extensive work on this topic. can the authors' analysis adapt to such situations? 2. taking logistic regression as an example again, the minimizer does not lie within a bounded region; can the authors' analysis be adapted to analyze such a case? 3. the results are all in the form of expectations; can the authors bound the l2 norm? 4. i think it would be informative to add the result when the loss is strongly convex, where we can have a bound for the solutions. docsepthis paper analyzes adaptive algorithms such as adagrad and amsgrad in a finite-sum optimization problem. the proofs appear to treat this setting through online convex optimization and online-to-batch conversion. it is shown that both adagrad and amsgrad improve when the individual losses are all minimized at the same point. further, line search techniques are analyzed in conjunction with these algorithms, and empirical results show that in practice the line searches speed up convergence. i do not think theorem 1 is particularly novel; i am not sure of an original reference (it may be a kind of folklore), but see for example https://parameterfree.com/2019/09/20/adaptive-algorithms-l-bounds-and-adagrad/ (theorem 7), from which it is trivial to deduce theorem 1 by observing that wlog we may assume min_w f_i(w) = 0 for all i, since subtracting the minimum value does not change the gradients. for theorem 2, i may be missing something, but this value of alpha seems strictly worse than what we would get in theorem 1 by just setting eta = eta_max. so what is the line search buying us? is it just for
empirical performance, with no theoretical benefit? yet it is not obviously presented this way, so if so, i think some remarks to this effect are in order. the assumption of bounded eigenvalues for the results on amsgrad seems a little troublesome to me: i am worried that all of the adaptive nature of the preconditioner is irrelevant and these assumptions are doing all the heavy lifting. indeed, if we set beta = 0, then with learning rate eta = a_min/L and the sgd-style update w_{t+1} = w_t - eta * v^{-1} * g_t for any v in [a_min, a_max], i suspect that standard analysis of gradient descent using learning rates at most 1/L will yield fairly similar results to theorem 3. i would be happy to hear otherwise, though. my overall feeling is that there is a missing piece here in the theory to show that the line search is useful; i am not confident that the other results are significant on their own. as for the empirical results, these seem like reasonable gains over adam. i would have preferred to see more standard deep learning benchmarks on non-image tasks as well, but i am not an expert here and so would defer to other opinions. nits: i am not sure that the assumption that the iterates are bounded is well justified here. i do believe it has been assumed in some past literature, but this does not make it actually true; it is certainly not standard in the literature on online learning. the two references cited as evidence here, duchi et al (2011) and levy et al (2018), do not actually assume this; instead they use projections to ensure that the iterates are bounded without assumptions. docsepthis paper revisits two important stochastic algorithms, adagrad and amsgrad. it reanalyzes these two algorithms under the interpolation setup and shows how the results improve in this particular case. strengths: the interpolation setup is reasonable in the overparametrized regime. this paper establishes a thorough analysis of adagrad and amsgrad under the interpolation setup. further, it incorporates the stochastic line search technique and the stochastic polyak step-size technique with amsgrad to make the step-size selection adaptive. the improvement results and the dependence on the extent of interpolation violation seem interesting. concerns: why do the authors not consider the unconservative stochastic line search and polyak step size for adagrad? also, i'm wondering what the technical challenges are in re-establishing these results: is it simply a matter of combining the classical analysis with the previous work (vaswani et al 2019, "painless stochastic gradient: interpolation, line-search, and convergence rates"), where the step size lies in a bounded interval, lower bounded away from zero, so that the randomness of the step size is well controlled? for future improvement, i think the authors should emphasize the difficulty of the analysis more clearly. based on my reading, the authors list comprehensive results, while the significance of these results is less discussed. it would be good to provide a general proof framework to make clear how interpolation enables a sharper analysis. during the rebuttal, the authors should highlight some technical difficulties in the paper: in principle, stochastic line search in the overparameterized regime does not make things harder, because the step size is lower bounded. the difficulty with a stochastic step size is to control the product of the step size and the gradient, while under this regime the product is separable; it seems the analysis of momentum plus this observation is enough. it would be useful to provide some insights into the analysis and its challenges. ### Summary:
dear authors, the paper contains many interesting and novel ideas. indeed, tuning the step size is very time- and energy-consuming, and deriving and analyzing new adaptive algorithms has not only theoretical benefits but, more importantly, is key when training more complicated ml models. the paper contains many weaknesses, as noted by the reviewers; i know that you have addressed many of them. one of the reviewers is still concerned about the other issues involving theorem 1 and the assumption of a bounded preconditioner. he thinks the preconditioner bound is troublesome in the overparameterized regime: he would expect the gradients to become near zero as the algorithm converges, which would actually cause the preconditioner to not be bounded below. it seems that the analysis might actually improve if the authors abandoned amsgrad/adam and instead just considered sgd, for which the preconditioner assumption is not an assumption but just a property of the algorithm. thank you
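the reduction-to-sgd thought experiment in the third review above can also be checked numerically; the quadratic objective, the scalar preconditioner drawn from [a_min, a_max], and the choice eta = a_min/L below are assumptions made for illustration, not the setup of the paper.

```python
import numpy as np

# With preconditioner values confined to [a_min, a_max], the "adaptive" update
# w_{t+1} = w_t - eta * g_t / v is plain gradient descent with an effective step
# in [eta/a_max, eta/a_min]; taking eta = a_min/L keeps every step at most 1/L,
# so the classical smooth analysis applies no matter how v is chosen.
rng = np.random.default_rng(1)
Q = rng.normal(size=(8, 8))
H = Q.T @ Q + np.eye(8)                 # Hessian of a strongly convex quadratic
L = np.linalg.eigvalsh(H).max()         # smoothness constant
w_star = rng.normal(size=8)
grad = lambda w: H @ (w - w_star)

a_min, a_max = 0.5, 2.0
eta = a_min / L

w = np.zeros(8)
for t in range(500):
    v = rng.uniform(a_min, a_max)       # arbitrary bounded "adaptive" scaling
    w = w - eta * grad(w) / v           # effective step eta/v <= 1/L
print(np.linalg.norm(w - w_star))       # decays as for plain gradient descent
```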
input_ids: [ token-ID encoding of the example above; array omitted ]
attention_mask: [ all ones; array omitted ]
labels: [ identical to input_ids; array omitted ]
Below is a review of a research paper from a conference or journal. Please write a summary of the review. ### Review: this paper proposes a bayesian inference methodology incorporating coresets with hamiltonian flows. the paper demonstrates theoretically the challenges that both coresets and variational inference via hamiltonian dynamics face, and proposes a fix for both in their algorithm, sparse hamiltonian flows. their method first selects a coreset and then follows a sparsified hamiltonian flow with quasi-refreshments, which allow the flow to update the momentum. important or argumentative claims are backed up with theoretical proofs. experiments on a variety of regression problems demonstrate the superiority of their algorithm over current state-of-the-art coreset compression and variational-flow-based methods. strengths: important claims and new insights on coresets and hamiltonian flows are backed up with theoretical proofs; where it may be difficult to prove a claim in the general case (such as proposition 3.1), a representative example is given and the claim is proven on it. the proposed idea is very novel and addresses important drawbacks that current coreset methods suffer from. a thorough and clear review of related and past methods is provided. experiments are well conducted, and a variety of datasets, both synthetic and real, are explored. weaknesses: perhaps some other choices of rhot and rlambdax could be explored, so that a potential user could understand the sensitivity of the algorithm to these choices. yes docsepthe paper introduces a new method for constructing bayesian coresets. the authors demonstrate that a single uniform subsampling of data points is in principle sufficient to obtain an exact coreset, and introduce the sparse hamiltonian flow to efficiently construct and sample from the corresponding coreset posterior approximation. notable improvements over other coreset methods are reported in several experiments. originality: to my knowledge the presented method is original. the authors provide a footnote citing concurrent work based on similar ideas, but the combination of hamiltonian flow approximations with the coreset posterior seems unique to this work. one could argue that the authors should provide a dedicated related work section to elaborate on the connections to earlier work. quality: the paper is technically sound and the claims are carefully developed and well supported. the paper could be further improved with some reflection on the limitations of the approach. clarity: the manuscript is well structured and very clearly written, with helpful introductions to the methodological ingredients that it builds upon. significance: the paper constitutes a significant contribution within research on bayesian coresets, both in terms of methodology and measured in terms of the performance improvements over other methods. i am not certain how large a contribution it will have to the field of bayesian inference in general; this would have been easier to assess if the authors had broadened the scope of their baselines to other bayesian inference procedures. the authors do not discuss the limitations of their method. docsepbayesian inference via sparse hamiltonian flows combines three techniques to make bayesian inference faster and more accurate: (a) a subsampling of the data (coresets), (b) sparse flows, and (c) quasi-refreshments. the paper provides theoretical evidence for why these subcomponents reduce the runtime or increase performance (see section 3) and empirical evidence in three different settings. the sparse hamiltonian flows
(shf) clearly and strongly outperform the alternatives in most experiments. update: thanks for addressing all of my concerns; i update my score from 7 to 8. great work, keep it up. in short, i think the paper is good and should be published with minor revisions. strengths: the paper is very well written and clear. the suggested combination of methods clearly and strongly improves performance compared to the alternatives. the paper provides a theoretical analysis of why, and in which manner, the performance improves due to shf. weaknesses: the experiments are all in fairly simple settings; the results already convince me that the method is very strong and warrants publication, but a more complex experiment would increase this conviction (see questions). the paper says little about its limitations (see questions). i think the limitations of the method were discussed insufficiently and should be addressed as described in questions 1 and 2. i'll use the rest of the section for high-level comments. in its current form the paper convinces me that shf decreases runtime and increases performance for datasets with low complexity; the authors show this with their theoretical analysis and empirical experiments. furthermore, the paper is well written and the presentation is good. all of this combined already warrants publication in my opinion. the assumptions that shf makes, and the implied limitations, are underexplored: i expect that shf will have a hard time with more complex models and data, because it assumes that a random selection of data points is representative of the entire dataset. i think a good response or an additional experiment in this direction would convince me to raise my score further. note that i think the paper would be improved even if the method is more limited than expected; stating limitations helps readers and practitioners because it defines the scope of possible use cases more clearly. i want to help where i can, so in case something is unclear feel free to ask follow-up questions. ### Summary:
all reviewers agree that the paper proposes an interesting approach to bayesian inference, incorporating coresets with hamiltonian flows. although some reviewers had some technical concerns in their first reviews, these have basically been resolved by the authors' responses. thus, although there are some points that should be modified from the current form, i think we can expect the authors to modify the paper in the camera-ready by reflecting the discussion. based on these considerations, i recommend acceptance for this paper.
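a minimal sketch of the coreset construction the reviews describe: uniformly subsample m of the n data points and attach weights so the weighted log-likelihood stands in for the full one inside the flow. the gaussian location model, the flat prior, and the fixed unbiased weights n/m are assumptions for illustration; sparse hamiltonian flows additionally learn the weights and the flow parameters jointly.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 10_000, 50
x = rng.normal(loc=1.5, scale=1.0, size=n)     # data from N(theta, 1)

idx = rng.choice(n, size=m, replace=False)     # one uniform subsample
w = np.full(m, n / m)                          # unbiased weights; SHF would train these

def full_potential_grad(theta):
    # d/dtheta of sum_n log N(x_n | theta, 1) under a flat prior
    return np.sum(x - theta)

def coreset_potential_grad(theta, w):
    # same quantity, computed from the weighted subsample only
    return np.sum(w * (x[idx] - theta))

for theta in (0.0, 1.5, 3.0):
    print(theta, full_potential_grad(theta), coreset_potential_grad(theta, w))
# the weighted subsample tracks the full-data gradient up to O(n / sqrt(m)) noise;
# training the weights (and the flow) is what shrinks this gap further.
```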
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 247, 17699, 16561, 17032, 16182, 24049, 23018, 1507, 342, 10546, 7839, 757, 14221, 253, 2929, 14371, 28055, 253, 7881, 326, 1097, 23018, 1507, 285, 39762, 17032, 3066, 10546, 7839, 757, 8062, 2454, 285, 29328, 247, 4993, 323, 1097, 275, 616, 5933, 23507, 10546, 7839, 757, 14221, 616, 1332, 806, 34899, 247, 820, 19511, 285, 840, 3637, 247, 37139, 1245, 10546, 7839, 757, 2685, 342, 21582, 603, 47120, 942, 534, 4483, 253, 2685, 281, 5731, 253, 10254, 1774, 390, 4154, 800, 3916, 403, 17245, 598, 342, 10527, 27947, 4679, 327, 247, 5235, 273, 9077, 3237, 7568, 253, 34385, 273, 616, 5933, 689, 1655, 1375, 23037, 14387, 820, 19511, 13800, 285, 39762, 5449, 3169, 3082, 20544, 50276, 18108, 3916, 285, 747, 16039, 327, 23018, 1507, 285, 10546, 7839, 757, 14221, 403, 17245, 598, 342, 10527, 27947, 835, 778, 320, 2834, 281, 5276, 275, 253, 2087, 1083, 824, 347, 13989, 4562, 247, 8612, 1650, 310, 1677, 285, 253, 1750, 310, 11464, 327, 352, 50275, 783, 4081, 2934, 310, 1077, 4460, 285, 12453, 1774, 30453, 326, 1655, 820, 19511, 3082, 11089, 432, 50276, 66, 11080, 285, 2590, 2278, 273, 2905, 285, 2469, 3082, 310, 2530, 50276, 16217, 3825, 403, 973, 11018, 264, 285, 247, 5235, 273, 15302, 1097, 13506, 285, 1524, 403, 14859, 50276, 20881, 1255, 265, 50276, 30875, 690, 643, 10165, 273, 391, 12022, 285, 391, 77, 1369, 69, 991, 812, 320, 14859, 594, 326, 247, 2442, 2608, 812, 2096, 253, 7340, 273, 253, 5933, 281, 841, 10165, 4754, 5474, 339, 431, 248, 2929, 23970, 247, 747, 1332, 323, 26736, 17699, 16561, 23018, 1507, 253, 4477, 7568, 326, 247, 2014, 6447, 8790, 312, 4906, 273, 941, 2792, 310, 275, 8063, 4209, 281, 4044, 271, 3242, 820, 19511, 285, 9569, 253, 23507, 10546, 7839, 757, 2685, 281, 14556, 3989, 285, 3410, 432, 253, 3969, 820, 19511, 12637, 11193, 16613, 11701, 689, 643, 820, 19511, 3082, 403, 2361, 275, 2067, 4679, 50275, 19164, 414, 281, 619, 3640, 253, 3559, 1332, 310, 3236, 253, 4477, 2085, 247, 43302, 19936, 17336, 789, 1754, 327, 2074, 5697, 533, 253, 5019, 273, 10546, 7839, 757, 2685, 34754, 281, 253, 820, 19511, 12637, 3133, 4451, 281, 521, 789, 581, 812, 9059, 326, 253, 4477, 943, 2085, 247, 9940, 2905, 789, 2593, 281, 21184, 327, 253, 10291, 281, 4321, 789, 50275, 15177, 253, 2929, 310, 22335, 3590, 285, 253, 3916, 403, 9257, 3715, 285, 973, 4516, 253, 2929, 812, 320, 2007, 5520, 342, 690, 12906, 327, 253, 7364, 273, 253, 2746, 50275, 498, 15752, 253, 7714, 310, 973, 18872, 285, 1077, 4518, 3542, 342, 9371, 3092, 960, 281, 253, 35961, 12696, 326, 352, 21168, 2220, 50275, 9188, 40348, 253, 2929, 16988, 247, 1534, 7680, 1561, 2561, 327, 17699, 16561, 23018, 1507, 1097, 275, 2426, 273, 16182, 285, 4080, 275, 2426, 273, 253, 3045, 11701, 689, 643, 3082, 891, 717, 417, 2176, 849, 1781, 247, 7680, 352, 588, 452, 281, 253, 1673, 273, 17699, 16561, 17032, 275, 2087, 436, 651, 452, 644, 6927, 281, 2939, 604, 253, 4477, 574, 3862, 2348, 253, 7990, 273, 616, 1666, 25379, 281, 643, 17699, 16561, 17032, 7259, 50276, 783, 4477, 513, 417, 2319, 253, 7364, 273, 616, 1332, 5474, 339, 15656, 333, 16561, 17032, 3066, 23507, 10546, 7839, 757, 14221, 24772, 1264, 5609, 281, 1056, 17699, 16561, 17032, 7938, 285, 625, 7899, 352, 24772, 247, 8790, 312, 4906, 273, 253, 941, 5161, 5239, 270, 23507, 14221, 285, 260, 21582, 603, 47120, 942, 50276, 783, 2929, 3400, 10527, 1941, 323, 2139, 841, 749, 22127, 4796, 253, 20243, 390, 2572, 3045, 923, 2593, 495, 
285, 16774, 1941, 275, 1264, 1027, 7533, 253, 23507, 10546, 7839, 757, 14221, 439, 71, 4518, 285, 7052, 562, 32231, 253, 18075, 275, 954, 4679, 50275, 11183, 6701, 323, 15974, 512, 273, 619, 7350, 891, 5731, 619, 4868, 432, 818, 281, 854, 1270, 789, 1978, 352, 598, 50276, 249, 2159, 891, 1158, 253, 2929, 310, 1175, 285, 943, 320, 3863, 342, 5884, 38549, 50275, 296, 3755, 20556, 50276, 783, 2929, 310, 1077, 973, 15720, 285, 2590, 50276, 783, 5125, 5019, 273, 3082, 4518, 285, 7052, 19132, 3045, 2429, 281, 253, 18075, 50276, 783, 2929, 3400, 247, 10527, 1783, 273, 2139, 285, 275, 534, 5133, 253, 3045, 19132, 1955, 281, 439, 71, 50275, 20881, 1255, 265, 50276, 783, 4679, 403, 512, 275, 9648, 2969, 7533, 253, 1543, 2168, 18578, 479, 326, 253, 1332, 310, 1077, 2266, 285, 32570, 9311, 533, 247, 625, 2570, 3368, 651, 2572, 436, 9611, 923, 3533, 50275, 783, 2929, 2296, 1652, 670, 697, 7364, 923, 3533, 50276, 74, 1158, 253, 7364, 273, 253, 1332, 497, 5469, 12497, 314, 285, 943, 320, 9713, 347, 2529, 275, 3533, 337, 285, 374, 50275, 408, 897, 253, 1551, 273, 253, 2593, 323, 1029, 5251, 5701, 50276, 249, 697, 1655, 830, 253, 2929, 13136, 707, 479, 326, 439, 71, 12075, 20243, 285, 5459, 3045, 323, 15302, 342, 1698, 10454, 253, 4477, 921, 436, 342, 616, 10527, 1783, 285, 16774, 4679, 33810, 253, 2929, 310, 973, 15720, 285, 253, 9759, 310, 1175, 512, 273, 436, 5678, 2168, 32570, 9311, 275, 619, 4743, 50275, 783, 13260, 326, 439, 71, 2789, 285, 253, 10466, 7364, 403, 15560, 18398, 446, 2149, 891, 1902, 326, 439, 71, 588, 452, 247, 1892, 673, 342, 625, 2570, 3210, 285, 941, 984, 352, 19584, 326, 247, 3632, 5438, 273, 941, 2792, 310, 8612, 273, 253, 2862, 10895, 891, 1158, 247, 1175, 2380, 390, 271, 3081, 3368, 275, 436, 3884, 651, 18578, 479, 281, 7164, 619, 4868, 2007, 3877, 326, 891, 1158, 253, 2929, 651, 320, 5520, 1014, 604, 253, 1332, 310, 625, 3710, 685, 3264, 14851, 7364, 7729, 10668, 285, 24432, 984, 352, 13067, 253, 7990, 273, 1896, 897, 2219, 625, 4518, 50275, 74, 971, 281, 1361, 835, 891, 476, 275, 1083, 1633, 310, 12744, 1928, 1959, 281, 1642, 956, 484, 3533, 50275, 187, 187, 4118, 18435, 27, 455, 30628, 5194, 326, 253, 2929, 29328, 271, 4722, 2746, 281, 17699, 16561, 17032, 24049, 23018, 1507, 342, 10546, 7839, 757, 14221, 3738, 690, 30628, 452, 690, 7681, 7350, 387, 616, 806, 10123, 10323, 1110, 452, 644, 11512, 407, 253, 4477, 6128, 3021, 3738, 627, 403, 690, 2792, 326, 943, 320, 7321, 432, 253, 1655, 830, 891, 1158, 359, 476, 1902, 253, 4477, 10007, 253, 2929, 275, 253, 4049, 254, 609, 5102, 407, 18964, 253, 5955, 1754, 327, 841, 891, 5583, 14924, 323, 436, 2929 ]
attention_mask: [omitted; all ones, same length as input_ids]
labels: [token id sequence omitted; it appears identical to the input_ids list for this row]
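The three list columns in each row simply re-encode the text columns. As a rough sketch of how such rows are typically built (assuming a Hugging Face tokenizer and a causal-LM setup in which prompt, review and summary are concatenated and the labels copy input_ids; none of this is stated in the dump itself):

```python
from transformers import AutoTokenizer

# Assumption: the dump does not say which tokenizer produced input_ids;
# GPT-2's BPE tokenizer is used here purely for illustration.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

def build_row(input_text: str, output_text: str) -> dict:
    """Turn one Input/Output pair into input_ids, attention_mask and labels."""
    full_text = input_text + " " + output_text   # prompt + reviews + summary
    enc = tokenizer(full_text)
    return {
        "input_ids": enc["input_ids"],            # token ids of the whole sequence
        "attention_mask": enc["attention_mask"],  # all ones when nothing is padded
        "labels": list(enc["input_ids"]),         # labels duplicate input_ids here
    }
```

Under this assumption an all-ones attention_mask just means the example was not padded; a padded batch would carry zeros at the pad positions.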
Below is a given review of a research paper from a conference journal. Please write a summary of the review.

### Review:

The paper proposes a new loss function to improve the performance of neural network models trained by DP-SGD. The new loss function is a weighted average of the sum of squared error, the focal loss, and a penalty on the squared norm of the pre-activation output of different layers. The new loss achieves state-of-the-art accuracy on the CIFAR-10, Fashion-MNIST and MNIST datasets. It is also shown that the new loss can reduce the bias of gradient clipping and encourage learning on hard examples. Overall, I think this is an interesting paper that explores the impact of the loss function on the performance of models trained by DP-SGD. Also, the experiments are quite comprehensive, uncovering the importance of choosing loss functions. Yet there are still some aspects that could be improved. (1) Though the loss can improve upon cross-entropy, it introduces three extra hyperparameters to tune. It seems the paper only reported the best performance after hyperparameter searches; however, the sensitivity of performance with respect to these hyperparameters is not reported or discussed. It would be better if such results could be added. (2) The impact of the loss function on model performance is an important and interesting topic. The loss function proposed by the paper consists of three components, all from existing literature. I wonder, have the authors tried other components, e.g., using hinge loss or Huber loss to replace the sum of squared errors, or using other penalties? After rebuttal: my concerns are mostly addressed. As pointed out by other reviewers, some intuitive arguments for motivating the losses are not well supported and are misleading, e.g., using Allen-Zhu et al. (2019) to argue faster convergence in practice (mentioned by reviewer vgsd). However, I believe this paper should be viewed as an empirical paper, and I am satisfied with the authors' efforts in exploring the impact of the loss function on the performance of DP-SGD; thus I would like to keep my score. Overall, I think the paper is interesting and could be further improved if more results are added.

The paper proposes a tailored loss for DP-SGD which includes summed squared error, the focal loss, and a regularization penalty. The summed squared error is for fast convergence at the initial stage, the focal loss is used to identify hard samples, and the regularization penalty is used for reducing the gradient/weight norm and avoiding explosion. For each component of the tailored loss, the paper has empirical or theoretical evidence to argue its necessity. It is a good attempt to improve the performance of DP-SGD by amending the loss. There are several weak points in this paper. The paper uses theoretical evidence to argue that summed squared error is good for fast convergence, but we have to admit that the convergence result in Allen-Zhu et al. (2019) has strict conditions (NTK regime, ReLU, and square loss) which may not hold in the practical experiments. Even if these results hold, the fastness of exponential convergence is more evident at the final stage of convergence; here, however, it is used to argue for fastness at the initial stage, which is not persuasive. In practice the convergence rate is more related to the choice of optimizer (SGD or Adam) and of learning rate than to different losses, and the paper does not have a good comparison with these choices. The regularization penalty is quite similar to the function of weight decay; how does it compare with normal weight decay? It is supposed that a naive application of DP-SGD with weight decay can achieve better results than 59% on CIFAR-10. Most importantly, CIFAR-10 may not be a good dataset for benchmarking DP algorithms, for one obvious reason: it is not a sensitive dataset from any aspect. It is too small a dataset, with 5000 samples per class, so the DP algorithm may not perform as well as expected unless a pretrained model is used. Benchmarking DP algorithms on CIFAR-10 may lead to over-optimized models and/or losses that are not able to generalize to other real privacy-sensitive scenarios, i.e., language models. Good intuitions about the DP algorithm, but the results and arguments are not convincing; overall, the review would like to give a weak reject to this paper.

This paper attempts to improve the accuracy of DP-SGD training from the perspective of loss function design. The authors propose a loss composed of SSE loss, focal loss, and an L2 regularization penalty. Experiments are conducted to demonstrate the effectiveness of the proposed loss. Strengths: (1) improving the utility of private deep learning from a loss design perspective is novel to me; this may also be valuable to the community. (2) The model obtained by optimizing the proposed loss achieves the SOTA result. Weaknesses: the main disadvantage of this article is that it does not explain clearly the motivation for designing the loss and why the proposed loss can work. Although the authors tried to analyze the mechanism of the loss in Section 3.2, these analyses were not convincing enough and did not clearly explain why the loss works. Specifically: (1) the authors indicate that "our loss limits information loss from clipping" through Fig. 1, but do not explain why the proposed loss can prevent logit values and weights from exploding. (2) The authors claim that "our loss yields faster convergence" but do not provide proof. (3) The authors mention that preventing logits from exploding may also help recover the generalization boost of batch normalization, which can only be taken as a guess based on observations; a rigorous theoretical analysis may be required to strengthen the claim. (4) It seems that Figure 2 does not serve as an indication that the proposed loss function works because it enhances smoothness: according to Fig. 2, the smoothness of the network trained with CE loss keeps getting better, while the smoothness of the model trained with the proposed loss is getting worse. Another question: are the parameters robust to different hyperparameter settings in the loss? None.

This paper proposes a new loss that consists of three parts, namely MSE, focal loss, and a regularization loss. The new loss is supposed to benefit DP deep learning in comparison to the cross-entropy loss; accuracy improvement has been observed on MNIST, Fashion-MNIST and CIFAR-10. I think this paper is clear and easy to follow. The underlying philosophy, that maybe DP learning should not use the same loss function as regular non-DP learning, is interesting. This empirical paper has analyzed a set of computer vision datasets, and the improvement does exist. I appreciate the authors' efforts to do an ablation study and to give details of network architectures etc. in the appendix. However, my main concern is the message that this paper is trying to convey. First of all, the experiments are not comprehensive: only computer vision tasks are included, and only on toy datasets; I believe including recommendation-system and NLP datasets would make the case for using their new loss much more convincing. Secondly, the empirical improvement is not significant: the seemingly most significant improvement, the one presented in the abstract, is 4% on CIFAR-10; given that a non-private model can easily get over 90% on CIFAR-10, I believe the gap is not really closed by introducing the new loss. Lastly, the lack of theory may make people wonder whether this improvement really holds for general DP training; at most we can say that on specific datasets and specific models, using the new loss is beneficial. Also, all three components of the new loss already exist in regular training, so the current approach seems a simple combination without sufficient justification. The paper is clear and easy to follow, combining three existing losses to form a new one that empirically improves DP accuracy on some vision datasets; however, the experiments should include non-vision tasks and at least discuss the insight from a theoretical viewpoint. Also, the improvement is not significant enough for the ICLR venue.

### Summary:
The reviewers all seemed to agree that the investigation of other losses is an interesting direction of study, and acknowledged there was some empirical performance improvement for standard computer vision tasks. However, they felt the justification of the specific form of loss was a bit shaky and heuristic, and were furthermore unconvinced by results exclusively for image classification; one reviewer was unmoved by the magnitude of improvement. This was a borderline decision, but we hope the authors refine and resubmit their work, as this is an interesting but underexplored direction within DP ML. As one recent related work which investigates the effect of other architecture differences in the DP setting, the authors may be interested in https://arxiv.org/abs/2110.08557.
input_ids: [token id sequence omitted; tokenized form of the review and summary text above]
attention_mask: [omitted; all ones, same length as input_ids]
labels: [token id sequence omitted; it appears identical to the input_ids list for this row]
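The loss debated in the reviews above has three parts: a summed squared error term, a focal term, and a penalty on pre-activation norms. The exact weights and focal form are not given in the reviews, so the following PyTorch-style sketch only illustrates that three-part structure; alpha, beta, gamma and lam are placeholder hyperparameters (and they are exactly the extra tuning burden the first reviewer complains about):

```python
import torch
import torch.nn.functional as F

def tailored_loss(logits, pre_activations, targets,
                  alpha=1.0, beta=1.0, gamma=2.0, lam=1e-3):
    """Illustrative three-part objective: SSE + focal + pre-activation norm penalty."""
    num_classes = logits.shape[-1]
    probs = F.softmax(logits, dim=-1)
    one_hot = F.one_hot(targets, num_classes=num_classes).float()

    # 1) summed squared error between the softmax output and the one-hot target
    sse = ((probs - one_hot) ** 2).sum(dim=-1).mean()

    # 2) focal loss, which down-weights easy (already well-classified) examples
    p_true = probs.gather(1, targets.unsqueeze(1)).squeeze(1)
    focal = (-(1.0 - p_true) ** gamma * torch.log(p_true + 1e-12)).mean()

    # 3) penalty on the squared norm of intermediate pre-activation outputs
    penalty = sum((z.flatten(1) ** 2).sum(dim=1).mean() for z in pre_activations)

    return alpha * sse + beta * focal + lam * penalty
```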
Below is a given review of a research paper from a conference journal. Please write a summary of the review.

### Review:

Summary: the authors investigate the token embedding space of a variety of contextual embedding models for natural language, using techniques based on nearest neighbors, clustering and PCA. They report a variety of results on local dimensionality, anisotropy, clustering and manifold structure in these embedding models, which are of general interest to scientists and practitioners hoping to understand these models. These include findings of local isotropy in the embeddings when appropriately clustered and shifted, and an apparent manifold structure in the GPT models. Reasons for score: this is a generally thorough and well-executed paper, and a careful examination of these embedding models is of great general interest to the ICLR community. However, I feel the analysis is a bit shallow at certain points and consists of reporting interesting findings without necessarily delving deeper or adequately explaining them. I think this paper presents a great jumping-off point for further research on the subject, as it certainly raised quite a few questions with me. I support its acceptance but would hope to see the authors address some of the questions raised here. Positives: lots of analysis with several different techniques; very interesting and relevant subject area; thorough use of recent related work; the paper is quite well written and organized considering the inherent challenge of writing up research of this sprawling nature. Concerns and comments: noting the very different behavior of GPT from the other representations, could this be due to the learned positional encodings? This might also explain the Swiss-roll-style paths seen when examining the different embeddings of the same word types; could position along those curved paths be correlated to the sentence position of the tokens? In this spirit, it would be great to get a better characterization of what the different clusters correspond to, especially in the case of GPT. Could there be a better investigation of the relationship between predictive (causal and non-causal) models and these clusterings? More generally, the Swiss-roll observation is intriguing, but since it only appears in one family of models that has a very similar transformer architecture to models in which it does not appear, what are we to make of it? How can the intrinsic dimensionality at each layer increase with depth? Considering each layer as living on a manifold, the transformation at each layer should act as a coordinate chart for the next layer's manifold, which should only allow a reduction in dimension, unless I am missing something; that suggests that these estimators do not have enough samples and/or are measuring something different than the dimension of a manifold. Section 4.4 should explain how this could be happening. More generally, I find the low LIDs in GPT hard to understand or interpret without more analysis, and would also like to understand the very large first dimension of the GPT models. It would be great to see more suggestions and takeaways for practitioners: what, if anything, does local anisotropy imply for deep learning researchers doing work in this area?

Findings: the paper reproduces various existing findings about anisotropy of contextual representations viewed globally; contextual representations are highly isotropic within clusters of the representations; GPT representations follow a Swiss-roll manifold where the most frequent words appear at the head and less frequent words are gradually appended at the bottom; the Swiss-roll manifold is taller in deeper layers; BERT representations fall in a Euclidean space; the local intrinsic dimension of contextual embeddings is lower than for unigram embeddings. Pros: the manifold analysis of word frequency is intriguing and intuitive; the explanation of the experiments was clear in each section; the authors produce compelling evidence that the global token-level anisotropy of these representations is largely due to membership in large clusters. This is a valuable contribution because it explains previous findings in Ethayarajh (2019) and reconciles them with theoretical expectations. Cons: in Section 2.3 and Section 3.1 the paper gives insufficient credit to Ethayarajh (2019); as far as I could tell, every initial result is a reproduction of a result from Ethayarajh (2019) and the methods are very similar, though they acknowledge it. The methodology is largely taken from existing unigram embedding analysis. Once they start to identify very well defined clusters, I was very curious about the distinctions between the islands; it would not be difficult to inspect some of the data by hand, so I don't understand why the authors didn't try. The authors offer no analysis of the difference in behavior between different models; I felt like I was reading a taxonomy, and the plots were left for the reader to connect. The authors have presumably spent quite a while thinking about these geometric structures and models, so surely they have conjectures about the behavior they observed or hypotheses they can test. Minor and style: I didn't realize until the conclusion that the main finding was an explanation of existing claims about anisotropy by considering behavior within the clusters, so that needs more emphasis. The papers you cite do a decent job of explaining why isotropy in the representation space is significant, both token- and type-wise; the paper would be a lot more readable if you made a similar effort in explaining background. There is inconsistent use of "isotropy" vs. "isotropicity". Citations are needed for the claim that it is widely believed that the contextual space is so weird. The paper needs proofreading for minor grammar and typos, e.g. "downstreaming" instead of "downstream", "could resides" instead of "could reside". Instead of referring variously to high similarity or cosine, silhouette scores, and other measurements of isotropy, it might be clearer to link each concept to isotropy once and then, for each subsequent result, simply refer to it as isotropy while mentioning the metric; then the reader doesn't have to constantly remember which metric indicates high isotropy as they read the results. Questions: the authors claim to select the clusters that maximize MMS; I read the wording to imply that this optimum is tractable/stable, is that the case? The rating was updated from 3 to 7 in light of substantially expanded experiments and analysis.

This paper analyzes the geometry of several contextualized embeddings. The authors show that global anisotropy is caused by strong clustering of word vectors, and that vectors of different word types are isotropically distributed within the cluster. Strengths: this work is a nice-to-have extension of Ethayarajh (2019, https://www.aclweb.org/anthology/D19-1006.pdf) that dives deeper into the geometric properties of contextualized vectors. The research question is clearly stated (why doesn't anisotropy hurt performance?) and clearly answered (there is no anisotropy locally). The 3D visualizations provide a better geometric intuition than the flat visualizations that are common in this kind of paper. Issues: I don't think that good performance contradicts anisotropy; for example, we already know that the classical static embeddings are also anisotropic (Mimno and Thompson, 2017, https://www.aclweb.org/anthology/D17-1308.pdf), and this means that good performance as measured by downstream tasks may coexist with anisotropy, so please consider rewording the beginning of Section 1.2 (for example, instead of "there is an apparent contradiction", consider "it is not clear why"). How representative is one random sample from phi_i(t) for measuring s_inter in Formula 1? You gave an example in the introduction where the same word type "bank" can have totally different meanings depending on context, and thus I believe the corresponding phi_1(bank) and phi_2(bank) may be totally different; why not take more samples for polysemous words? Why do you use different distance metrics (Euclidean vs. cosine) for estimating LIDs of contextualized vs. static embeddings (Table 3)? Regarding "for GPT-2 we had hoped to find that some types are associated with one cluster and other types are associated with the other cluster, but that is not verified in our experiments": I think you should look at contexts rather than types, since you are dealing with contextualized embeddings; it would be interesting to see whether you have the same type in both clusters and then to look at its contexts, and I bet that the contexts will differ. Minor issues: you find a low-dimensional manifold in GPT/GPT-2 embeddings but not in BERT/DistilBERT embeddings, yet your LIDs are low for BERT/DistilBERT layers as well, so why can't you claim low-dimensionality for BERT/DistilBERT embeddings? I doubt that PTB with a 10k vocabulary size gives good coverage in 2020; you may simply state that this is a widely used dataset. Wiki2 (Merity et al., 2016) is usually referred to as WikiText-2. Please consider rephrasing "experiments" as "analysis", as you are not conducting controlled experiments but rather performing exploratory analysis of the embeddings.

### Summary:
This paper presents a broad exploratory analysis of the geometry of token representations in large language models, with a focus on isotropy and manifold structure, and reveals some surprising findings that help explain past observations. Pros: clear and surprising analytical findings concerning a broad and widely used family of models. Cons: the paper is a fairly broad exploratory analysis with no single precise claim that ties together every piece of the work. I thank both the authors and reviewers for an unusually productive discussion.
input_ids: [token id sequence omitted; tokenized form of the review and summary text above]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 6010, 50276, 783, 4477, 7409, 253, 10669, 21496, 2317, 273, 247, 5235, 273, 33876, 21496, 3210, 323, 3626, 3448, 970, 5609, 1754, 327, 5275, 15833, 17524, 285, 268, 6357, 597, 1304, 247, 5235, 273, 1543, 327, 1980, 7877, 1319, 50276, 266, 16174, 10144, 50276, 498, 49591, 50276, 38556, 2605, 275, 841, 21496, 3210, 534, 403, 273, 2087, 1600, 281, 10950, 285, 24432, 11525, 281, 2096, 841, 3210, 841, 2486, 4342, 273, 1980, 14539, 10144, 275, 253, 46234, 672, 20420, 29102, 285, 14728, 285, 271, 5165, 16751, 2605, 275, 253, 305, 431, 3210, 50275, 250, 3743, 323, 4868, 50276, 2520, 310, 247, 3839, 11080, 285, 6210, 1591, 886, 4525, 2929, 285, 247, 10182, 8368, 273, 841, 21496, 3210, 310, 273, 1270, 2087, 1600, 281, 253, 17857, 32888, 3114, 2299, 891, 1928, 253, 1783, 310, 247, 2372, 20126, 387, 2176, 2792, 285, 8414, 273, 9610, 4722, 4342, 1293, 7933, 1448, 1382, 12861, 390, 18212, 15571, 731, 50276, 74, 1158, 436, 2929, 10262, 247, 1270, 22802, 745, 1127, 323, 2007, 2561, 327, 253, 2256, 347, 352, 5604, 5439, 3240, 247, 1643, 3533, 342, 479, 891, 1329, 697, 14924, 533, 651, 3524, 281, 923, 253, 4477, 2953, 690, 273, 253, 3533, 5439, 1060, 50275, 993, 23223, 50275, 77, 1502, 273, 1783, 342, 2067, 1027, 5609, 50275, 635, 4722, 285, 4623, 2256, 2170, 50275, 42771, 602, 897, 273, 3332, 2905, 789, 50275, 783, 2929, 310, 3240, 973, 3542, 285, 10932, 7296, 253, 12794, 5691, 273, 4028, 598, 2561, 273, 436, 30108, 1981, 3753, 50275, 585, 1209, 2224, 50276, 26122, 50275, 43507, 253, 1077, 1027, 3879, 273, 305, 431, 432, 253, 643, 14237, 50276, 16534, 436, 320, 1955, 281, 253, 6311, 40798, 2349, 351, 723, 436, 1537, 671, 5513, 253, 1863, 739, 1811, 3740, 11865, 2326, 672, 17565, 253, 1027, 46234, 273, 253, 1072, 3159, 3510, 812, 1899, 2112, 1110, 22627, 11865, 320, 9578, 281, 6197, 1899, 273, 253, 21761, 50275, 249, 436, 5968, 352, 651, 320, 1270, 281, 755, 247, 1805, 14846, 273, 752, 253, 1027, 9959, 2723, 281, 3340, 275, 253, 1083, 273, 305, 431, 812, 627, 320, 247, 1805, 5839, 273, 253, 2954, 875, 15970, 50276, 68, 27026, 285, 1327, 68, 27026, 3210, 285, 841, 7368, 723, 50274, 3062, 3839, 253, 1863, 739, 4533, 8310, 310, 27807, 533, 1580, 352, 760, 4620, 275, 581, 2021, 273, 3210, 326, 556, 247, 1077, 2074, 39707, 10336, 281, 3210, 275, 534, 352, 1057, 417, 3176, 752, 403, 359, 281, 1056, 273, 352, 50275, 5430, 476, 253, 15276, 7877, 1319, 387, 1016, 3828, 2572, 342, 6864, 7296, 1016, 3828, 347, 3811, 327, 247, 16751, 253, 9261, 387, 1016, 3828, 943, 769, 347, 247, 13249, 8326, 323, 253, 1735, 8090, 16751, 534, 943, 760, 1581, 247, 5141, 275, 7877, 5734, 891, 717, 5816, 1633, 326, 5936, 326, 841, 48489, 513, 417, 452, 2217, 3530, 285, 263, 403, 10499, 1633, 1027, 685, 253, 7877, 273, 247, 16751, 2593, 7127, 943, 5513, 849, 436, 812, 320, 9369, 50274, 3062, 3839, 891, 1089, 253, 1698, 298, 2352, 275, 305, 431, 1892, 281, 2096, 390, 4665, 1293, 625, 1783, 285, 651, 671, 751, 281, 2096, 253, 1077, 1781, 806, 7877, 273, 253, 305, 431, 3210, 50275, 262, 651, 320, 1270, 281, 923, 625, 13991, 50276, 21528, 42287, 323, 24432, 752, 604, 2712, 1057, 1980, 30393, 16084, 323, 3676, 4715, 8607, 2509, 789, 275, 436, 403, 44180, 33032, 8606, 723, 50276, 250, 5551, 707, 2710, 5368, 4342, 670, 30393, 273, 33876, 14237, 11575, 21349, 50276, 8882, 780, 14237, 403, 4122, 29436, 1561, 9959, 273, 253, 14237, 50275, 72, 431, 14237, 956, 247, 1863, 739, 4533, 16751, 835, 253, 954, 
10879, 3000, 3176, 387, 253, 1481, 285, 1679, 10879, 3000, 403, 13237, 42873, 387, 253, 5004, 50276, 783, 1863, 739, 4533, 16751, 310, 38165, 275, 12861, 8090, 50276, 6291, 14237, 2965, 275, 247, 299, 26365, 2317, 50275, 783, 1980, 15276, 7877, 273, 33876, 46234, 310, 2406, 685, 323, 440, 304, 3358, 46234, 50276, 856, 84, 50276, 783, 16751, 1783, 273, 3159, 4294, 310, 27807, 285, 27350, 50276, 783, 8813, 273, 4679, 369, 2590, 275, 1016, 2593, 50276, 9328, 4711, 18511, 1941, 326, 253, 4156, 10669, 5251, 30393, 273, 841, 14237, 310, 8127, 1955, 281, 14199, 273, 1781, 9959, 436, 310, 247, 9865, 7680, 984, 352, 11424, 2045, 4342, 275, 5105, 333, 274, 1432, 73, 6247, 285, 30855, 3205, 731, 342, 10527, 12656, 50276, 5040, 50276, 249, 2593, 3495, 285, 2593, 4562, 253, 2929, 4245, 12497, 6152, 281, 5105, 333, 274, 1432, 73, 6247, 347, 2080, 347, 891, 812, 2028, 1046, 3302, 906, 310, 247, 21068, 273, 247, 906, 432, 5105, 333, 274, 1432, 73, 6247, 285, 253, 3082, 403, 1077, 2074, 50276, 2004, 597, 14409, 352, 253, 16182, 310, 8127, 2668, 432, 5368, 440, 304, 3358, 21496, 1783, 50276, 19131, 597, 1265, 281, 4271, 1077, 973, 2931, 9959, 891, 369, 1077, 14338, 670, 253, 42060, 875, 253, 17546, 352, 651, 417, 320, 2834, 281, 16030, 690, 273, 253, 941, 407, 1133, 594, 891, 13414, 2096, 2139, 253, 4477, 42126, 1611, 50276, 783, 4477, 3959, 642, 1783, 323, 253, 3064, 275, 3879, 875, 1027, 3210, 891, 3543, 751, 891, 369, 4361, 247, 2891, 13646, 285, 253, 14777, 497, 1669, 323, 253, 9414, 281, 4684, 253, 4477, 452, 18289, 5262, 3240, 247, 1223, 4680, 670, 841, 17856, 5289, 285, 3210, 594, 13353, 597, 452, 19704, 980, 670, 253, 3879, 597, 2540, 390, 24316, 597, 476, 1071, 50276, 37585, 50276, 4826, 50276, 74, 42126, 8968, 1919, 253, 6452, 326, 634, 2022, 4560, 369, 271, 8813, 273, 5368, 3916, 670, 30393, 407, 7296, 3879, 1561, 253, 9959, 594, 326, 3198, 625, 15075, 50276, 783, 9380, 368, 26542, 513, 247, 12524, 2628, 273, 15571, 2139, 14539, 10144, 275, 253, 6779, 2317, 310, 1534, 1097, 10669, 285, 1511, 3020, 253, 2929, 651, 320, 247, 2257, 625, 34025, 604, 368, 1160, 247, 2074, 3434, 275, 15571, 4114, 50276, 9088, 310, 16706, 897, 273, 14539, 10144, 4632, 14539, 1658, 5755, 50276, 34212, 3058, 323, 1750, 7561, 6566, 326, 253, 33876, 2317, 310, 594, 12504, 50276, 50234, 4737, 24042, 323, 5884, 28146, 285, 963, 993, 24088, 15450, 272, 3185, 273, 15450, 812, 31951, 3185, 273, 812, 28932, 50276, 34235, 273, 14339, 2710, 314, 281, 1029, 14259, 390, 7349, 460, 43031, 5464, 7363, 285, 643, 6341, 273, 14539, 10144, 352, 1537, 320, 30909, 281, 3048, 1016, 4473, 281, 14539, 10144, 2378, 285, 840, 1016, 6774, 906, 3365, 3730, 281, 352, 347, 14539, 10144, 1223, 29570, 253, 7982, 840, 253, 9414, 36908, 452, 281, 11485, 4456, 534, 7982, 6492, 1029, 14539, 10144, 347, 597, 1239, 253, 1543, 50276, 34974, 50276, 783, 4477, 1750, 281, 3609, 253, 9959, 326, 22950, 278, 983, 891, 1239, 253, 41066, 281, 16084, 326, 436, 24571, 310, 10649, 1752, 383, 494, 310, 326, 253, 1083, 50276, 9133, 13716, 369, 9300, 432, 495, 281, 818, 275, 1708, 273, 6832, 11848, 4679, 285, 1783, 7152, 33032, 2520, 2929, 3537, 13505, 253, 12087, 273, 2067, 33876, 1025, 46234, 253, 4477, 921, 326, 4156, 30393, 310, 4269, 407, 2266, 17524, 273, 3159, 11390, 285, 326, 11390, 273, 1027, 3159, 3510, 403, 14539, 1658, 1037, 5939, 1561, 253, 7368, 50276, 296, 3755, 20556, 50276, 2520, 789, 310, 247, 6815, 292, 1368, 1123, 6880, 273, 5105, 333, 274, 1432, 73, 6247, 3614, 2700, 29404, 7585, 2061, 14718, 1497, 69, 746, 45535, 9275, 326, 277, 1644, 12861, 715, 253, 
17856, 3607, 273, 33876, 1025, 11390, 50276, 783, 2561, 1953, 310, 4518, 4767, 2139, 36908, 30393, 8513, 3045, 285, 4518, 9577, 253, 373, 642, 30393, 12171, 50276, 783, 495, 69, 5304, 5904, 2085, 247, 1805, 17856, 30328, 685, 253, 6507, 5304, 5904, 326, 403, 1846, 275, 436, 2238, 273, 9380, 50276, 22402, 50276, 74, 13414, 1158, 326, 1175, 3045, 40878, 30393, 323, 1650, 359, 2168, 871, 326, 253, 8946, 4228, 46234, 403, 671, 35319, 13892, 2369, 285, 289, 297, 10836, 4240, 3614, 2700, 29404, 7585, 2061, 14718, 1497, 69, 1166, 1012, 2904, 9275, 285, 436, 2097, 326, 1175, 3045, 347, 4080, 407, 15450, 8892, 778, 820, 31477, 342, 30393, 594, 4496, 1908, 294, 88, 1573, 253, 5068, 273, 2593, 1249, 323, 1650, 3185, 273, 627, 310, 271, 5165, 20620, 1908, 352, 310, 417, 2590, 2139, 50275, 5430, 8612, 310, 581, 3632, 3410, 432, 815, 15208, 323, 10499, 331, 2068, 2388, 275, 7212, 337, 368, 3534, 271, 1650, 275, 253, 10199, 672, 253, 1072, 3159, 1511, 4310, 476, 452, 9106, 1027, 30460, 7293, 327, 3634, 285, 3021, 891, 2868, 253, 3969, 815, 74, 18, 1156, 17703, 285, 815, 74, 19, 1156, 17703, 778, 320, 9106, 1027, 2139, 417, 3192, 625, 3530, 323, 3488, 6017, 528, 3000, 50276, 22309, 513, 368, 897, 1027, 4181, 17082, 299, 26365, 4632, 7349, 460, 323, 26230, 298, 2352, 273, 33876, 1025, 4632, 4228, 46234, 2829, 495, 50275, 1542, 305, 431, 19, 359, 574, 13937, 281, 1089, 326, 690, 3510, 403, 2330, 342, 581, 7368, 285, 643, 3510, 403, 2330, 342, 253, 643, 7368, 533, 326, 310, 417, 16058, 275, 776, 4679, 50276, 74, 1158, 368, 943, 1007, 387, 22349, 2581, 685, 3510, 1580, 368, 250, 10620, 342, 253, 33876, 1025, 46234, 352, 651, 320, 4722, 281, 923, 1880, 368, 452, 253, 1072, 1511, 275, 1097, 9959, 285, 840, 281, 1007, 387, 697, 22349, 891, 701, 326, 253, 22349, 588, 9184, 50276, 37585, 3374, 50276, 664, 1089, 247, 1698, 6967, 16751, 275, 305, 431, 72, 431, 19, 46234, 533, 417, 275, 270, 797, 8155, 300, 6291, 46234, 50276, 2858, 634, 298, 2352, 403, 1698, 323, 270, 797, 69, 6291, 8090, 347, 973, 2139, 16216, 368, 1750, 253, 1698, 39120, 1319, 323, 270, 797, 69, 6291, 8473, 11174, 84, 50276, 74, 5545, 326, 268, 25192, 342, 884, 76, 30318, 1979, 4245, 1175, 7031, 275, 9169, 368, 778, 3365, 1375, 326, 436, 247, 7561, 3197, 10895, 50276, 16123, 19, 4285, 414, 1162, 355, 4022, 310, 3798, 6289, 281, 347, 259, 1479, 614, 633, 19, 50276, 32897, 1908, 294, 545, 83, 2355, 4679, 50276, 12792, 347, 368, 403, 417, 16472, 6537, 4679, 533, 2581, 9591, 41075, 1783, 273, 253, 46234, 187, 187, 4118, 18435, 27, 2520, 2929, 10262, 247, 3862, 41075, 1783, 273, 253, 12087, 273, 10669, 14237, 275, 1781, 3448, 3210, 342, 247, 2770, 327, 14539, 10144, 285, 16751, 2605, 285, 12957, 690, 10084, 4342, 326, 1361, 5513, 2469, 7313, 50276, 856, 84, 50276, 8250, 285, 10084, 16101, 4342, 8664, 247, 3862, 285, 7561, 3197, 2021, 273, 3210, 50276, 5040, 50276, 783, 2929, 310, 247, 9648, 3862, 41075, 1783, 342, 642, 2014, 10799, 1750, 326, 16027, 2366, 1046, 5313, 273, 253, 789, 50276, 74, 5717, 1097, 253, 4477, 285, 30628, 323, 271, 33940, 19303, 5955 ]
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: valuation criteria based on game theory eg the shapley value have been used in the ml literature for analyzing feature importance and for data subset selection. these criteria serve as solution concepts for cooperative games and have been adapted by some works in ml for subset valuation problems. the present paper presents a probabilistic treatment of cooperative games and shows that two classical valuation criteria can be seen as a onestep factored approximation to the maximum entropy solution to the game. they then propose a new valuation criterion, the variational index, that uses a multistep factored approximation and show it satisfies some common axioms for cooperative games. the paper also has experimental results on the proposed criterion. ps: im somewhat of an outsider to this topic with minimal familiarity with cooperative games. pros: i find the observation that classical criteria such as the shapley value can be seen as a onestep decoupling approximation to the maximum entropy probabilistic assignment to be quite interesting. in hindsight this seems like a natural connection, although im not sure if an expert on this topic would be equally excited about this result. the multistep approximation criterion that the authors propose also seems like a natural extension to existing gametheoretic criteria. cons: i dont find the experimental results to be compelling. they do not convincingly demonstrate that the proposed new criterion has some additional value over existing criteria for ml applications. i understand that the authors have chosen to make the meanfield perspective the central focus of the paper, but i think its important to also have a strong empirical section showcasing utility to an ml audience. the feature selection experiments in sec 52 show some improvements over two classical gametheoretic criteria, but the field of subset selection is now quite mature (eg https://arxiv.org/pdf/2006.15412.pdf) and comparing to just the two closest gametheoretic criteria doesnt seem to make a strong empirical case. the new criterion comes with the added cost of monte carlo sampling to approximate exponential sums over multiple rounds, but im not entirely sure if the experiments justify this added cost. needed: the writing is accessible but can be improved to better motivate the subset/feature selection applications, especially for an ml audience, and provide more elaborate intuition early on for the decoupling approximations that the classical criteria seek to compute. other comments/questions: alg 1: you refer to this algorithm as performing gradient ascent; to me it appears more like a fixed point iteration algorithm and not necessarily performing a gradientbased ascent step on the elbo objective. please correct me if im missing something here. sec 5: might be good to mention how the decoupling error is computed in all your experiments, given that it requires calculating an approximation of the boltzmann distribution. sec 51: i didnt quite understand the role of the data clustering in the experiment; on the xaxis in fig 1, do you add clusters of data points instead of individual data points? sec 52: in fig 2 you plot the predicted probabilities as you drop features; is this the average probability predicted by the model across all test examples? might be good to elaborate why this is a good evaluation metric to look at. sec 1/intro: i think your mention of a solution concept phi(f) needs elaboration for an audience not necessarily familiar with
cooperative games. similarly, in sec 2 it wasnt initially clear to me how the importance weights you write out in (2) and (3) relate to the solution concept phi(f). the paper presents some interesting theoretical connections but does not provide sufficiently compelling empirical results showcasing utility for applications in ml. docsepthis paper proposes an energybased perspective on cooperative games that permits a gradientbased calculation of shapleybanzhaf values as well as the definition of a new alternative value, the variational index. a quick summary of the papers key ideas is: for a given cooperative game f we can seek an entropy maximizing distribution over coalitions p(s) that satisfies a constraint on the mean coalition value mu. solving the entropy maximization problem via its lagrangian yields the boltzmann distribution p(s) propto exp(f(s)/t), where the temperature t has a one-to-one correspondence with the mean coalition value mu (this result is in the appendix). this distribution gives more probability mass to coalitions that achieve higher values. we can seek a simpler alternative to p(s) by doing meanfield variational inference, ie finding a factorized surrogate q(s) where each players participation is determined by independent bernoulli rvs. the result will intuitively assign higher probabilities to players that belong to highvalue coalitions, so these probabilities can serve a function similar to shapleybanzhaf values. the vi approach suggests a kl divergence minimization (or elbo maximization) objective for learning q(s), which is parameterized by x in [0, 1]^n. doing gradient descent on this objective yields a relatively simple update rule where we repeatedly set x_i = sigma(nabla_i f_m(x) / t) for i = 1, ..., n. the authors define the variational index as a function of the solution to the kl divergence minimization problem: s* = t sigma^{-1}(x*). the authors find that the banzhaf value can be found using a singlestep update to a particular initialization of the kl divergence minimization problem (luckily the temperature t is not important for singlestep updates). similarly they find that the shapley value is the average of the singlestep update applied to different initializations (again the temperature doesnt matter). finally the authors point out that any singlestep update applied to a symmetric initialization will be a probabilistic value, a class of solution concepts in cooperative game theory of which shapleybanzhaf values are special cases. lastly the authors suggest a practical samplingbased approach to calculating the necessary gradients, which are just as difficult to calculate as the shapleybanzhaf values because they require calculating the value for every coalition s subseteq n. the experiments compare the variational index to shapley and banzhaf values in data and feature removal tasks, finding that it performs quite favorably in the settings examined. the energybased perspective is to my knowledge a novel perspective for shapley values and player valuation in ml, and i found it quite cool to see these tools (ebms, meanfield vi) applied in this way. the variational index is interesting and it appears to perform well in the experiments relative to shapleybanzhaf values. i have a couple of questions/comments that i hope the authors will consider for improving the paper.
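to make the update rule summarized above concrete, here is a minimal numerical sketch of the meanfield iteration and the resulting variational index. this is an illustration only: the toy game, the function names, and the monte carlo estimator below are assumptions made for this example and are not taken from the papers code.

import numpy as np

def coalition_value(members):
    # toy coalition game on 4 players (purely hypothetical values)
    weights = [1.0, 2.0, 3.0, 4.0]
    return float(np.sqrt(sum(weights[i] for i in members)))

def marginal_gradient(f, x, i, n, n_samples=256, rng=None):
    # monte carlo estimate of E_{s ~ q_{-i}}[ f(s + {i}) - f(s) ],
    # where q_{-i} is the factored bernoulli distribution over the other players
    rng = rng or np.random.default_rng(0)
    total = 0.0
    for _ in range(n_samples):
        s = {j for j in range(n) if j != i and rng.random() < x[j]}
        total += f(s | {i}) - f(s)
    return total / n_samples

def variational_index(f, n, t=1.0, steps=50):
    x = np.full(n, 0.5)                     # symmetric initialization
    for _ in range(steps):                  # fixed-point / ascent iterations
        for i in range(n):
            x[i] = 1.0 / (1.0 + np.exp(-marginal_gradient(f, x, i, n) / t))
    return t * np.log(x / (1.0 - x))        # s = t * sigma^{-1}(x)

print(variational_index(coalition_value, n=4))

under this reading, a single pass of the inner loop from the symmetric initialization corresponds to the banzhaf-style singlestep update mentioned above, while iterating to convergence gives the variational index; the sketch is only meant to illustrate the mechanics described in the review.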
premise of ebms for cooperative games: the idea of using ebms to analyze cooperative games was a bit confusing and could probably be introduced better. to be specific, the paper starts by saying we should learn a probability distribution over coalitions, but this does not begin to sound like a worthwhile endeavor until several sections in. when we are using shapleybanzhaf values we control the distribution over coalitions/orderings, so learning a distribution sounds (at least initially) like a somewhat pointless idea. similarly, the idea of finding an entropy-maximizing coalition distribution constrained to a mean value (which its not clear how to choose) does not initially sound useful. of course its not pointless, but what ultimately connects these ebm ideas to player valuation is the crucial step of doing meanfield vi. because of the important role meanfield vi plays here, i wonder if it doesnt deserve a bit more attention and emphasis. appendix a attempts to explain this a bit more, but i didnt find these reasons compelling (particularly the second paragraph). unless im missing something, the real reason to use ebms / meanfield vi is that it enables us to learn a factorized distribution over players that places higher probabilities on players that contribute more value, and this gives us a new perspective for defining player valuations which happens to connect to existing ideas like shapley/banzhaf/probabilistic values. connection with multilinear extensions: the idea of learning a factorized distribution over players is most similar to the idea of multilinear extensions in cooperative game theory, as in okhrati and lipani. in that work shapley values are defined as the expected marginal contribution where the preceding coalition is determined by a factorized distribution over players, integrated over a probability value. here the probabilities of the factorized distribution are learned, but they can still coincide with shapley values. i wonder if the authors can provide any more commentary on the implicit connection between these ideas, where the probabilities in a factorized distribution can induce player valuations vs act as player valuations. cooperative game theory in ml: the paper gives a nice overview of the use of shapley values in ml, including uses in featurebased explanations, data valuation and model ensemble valuation. however a couple key papers are perhaps overlooked and could be cited: lipovetski conklin 2001 and strumbelj kononenko 2010 were some of the first papers to analyze statistical models using shapley values, and covert et al 2020 (already cited) also provides an overview of other papers that use shapley values (including spvim, sage and shapley effects) and it shows that many other ml explanation methods are also tied to cooperative game theory. entropy gradient: in section 42 the gradient nabla_i h(q) = log((1 - x_i) / x_i) is given but the result is not derived. it could be helpful to provide a derivation in the appendix, because this result seems nontrivial and it is important for the subsequent gradient descent routine. gradient descent derivation: the update rule in algorithm 1 does not immediately look like gradient descent and i expect it will be confusing to many readers. where, for example, is the learning rate? i tried to derive this result and if i understand correctly it comes from taking a gradient step on sigma^{-1}(x_i) and then applying the sigmoid operation to get x_i, where the learning rate is chosen as a function of the current value x_i. im not sure if thats right or if theres a simpler explanation, but there is too much work left to the reader here. practical impact: i found the ideas in this work very interesting and will view the paper favorably regardless of the answer to this question, but i just wanted to clarify the practical impact. am i correct in understanding that this energybased approach does
not necessarily offer a more efficient algorithm to calculating shapleybanzhaf values is the main practical impact then proposing the variational index as an alternative to shapleybanzhaf values for valuation problems applying existing shapley value approximations to the variational index in section 3 its stated that existing shapley value estimation ideas can be applied directly can be seamlessly lifted to calculating the variational index that point didnt come up later in the paper and i dont see how that is the case how for example could we use a permutationbased or weighted least squaresbased shapley value estimator to calculate the variational index or how could we use a modelspecific shapley value approximation like treeshap how could these things be integrated into algorithm 1 or be adapted into different routines for optimizing the kl divergence i dont get this some clarification on this point would be helpful role of temperature it might be worth noting explicitly for eqs 1415 that the specific temperature value does not matter and that it does not matter for any singlestep update currently the reader must figure this out for themselves aside from that its a bit unsatisfactory that different temperatures yield different variational indices and that we dont know much about the properties of the different solutions but i suppose its fine to leave further investigation to future work it could also be nice to have either a footnote or brief appendix section showing why t 0 and t to infty induce even spread and 01 probabilities respectively as this is also currently left to the reader role of initializer in section 42 theres a brief section discussing the initializers role wrt variational values it seems mostly right but im confused by the claim that the initializer doesnt matter if you plan on running gd to convergence how can that be given that the problem is nonconvex stated earlier in the paper these seem like contradictory ideas please clarify if possible additivity and efficiency properties in section 44 theres a paragraph discussing why variational values dont satisfy the additivity and efficiency properties satisfied by shapley values i found this paragraph a bit odd in addressing this why question your explanation addresses why they shouldnt reasons why these properties might be unappealing as if you had some choice in the matter rather than the mathematicalmechanistic reasons why they dont i would ask that you adjust this paragraph to clarify whether youre explaining i why those properties arent satisfied or ii why its okay that theyre not satisfied a couple things about the experiments in section 53 where we look at the convergence of algorithm 1 we can clearly see that it converges but does it converge to the same point regardless of the initialization this may be worth looking into due the problems nonconvexity is it worth looking into whether algorithm 1 can yield efficient lowvariance shapleybanzhaf value estimates relative to existing estimators do you have any intuition about how the variance might compare for a fixed number of game evaluations there were a couple specific phrases that i thought could be improved the introduction says that you explore a probabilistic treatment of games thats true but its not very specific because cooperative game formulations are sometimes probabilistic theres work considering stochastic cooperative games and shapleybanzhaf values have probabilistic formulations it might be better to say that you propose learning a factorized 
distribution over players to arrive at player valuations because thats whats unique here the same paragraph says something like this later but it leaves out the bit about learning a factorized surrogate distribution and the fact that the original distribution is encouraged to put more mass on players that contribute more value also in the introduction you state that you conduct learning and uncertainty analysis in a unified bayesian manner im not sure this is correct your method of course does vi but not uncertainty analysis or bayesian inference wheres the prior whats the data this paper introduces some very interesting ideas about the use of ebms and meanfield vi in the context of player valuation for cooperative games their new perspective is connected to shapleybanzhafprobabilistic values it permits approximate optimization the gradients require sampling and they define a new value the variational index that performs quite well in their experiments i had a couple questions and comments about the writing but i expect these will be easy to address docsepthe paper studies valuation problems for cooperative games it proposes a new valuation measure called variational index the idea is to create a coalition probability distribution based on a maximum entropy criterion player valuations are then derived by creating decoupled surrogates of this distribution the authors then present a gradient ascent algorithm to compute this decoupling classical valuation criteria like the shapley value and the banzhaf index can be recovered as special cases or modifications of the algorithms iterates regarding strengths to the best of my knowledge the proposed valuation of the framework is novel it is well motivated and the connections with existing classical methods are very interesting it also opens the door for further extensions as different surrogate models or application specific priors can be easily incorporated while it is hard to argue both in theory and in practice that one valuation method is better than the alternatives the empirical results seem to be reasonable i believe that this paper can have significant impact in the area in the immediate future regarding weaknesses the paper could be improved in terms of approachability to practitioners firstly reporting the run times of the experiments andor the accuracytime tradeoffs for mcmc methods would be useful for practitioners additionally any advice on how users should interpret the absolute scores of the variational index especially since t affects their scaling could be useful this is especially true for cases where the rankings of different valuation methods are similar but the absolute scores are not moreover the paragraph on why additivity or efficiency does not make sense for some games is very important for practitioners to understand practitioners may be easily tricked into thinking that the more properties a valuation measure satisfies the better regardless of the game they try to understand right now the paragraph is a bit dense and hard to follow for audiences not familiar with prior work on axiomatization of valuations in cooperative games toy numerical examples to demonstrate why additivity or efficiency can result in unintuitivenotuseful valuations could also help practitioners grasp what is the problem these weaknesses are minor so i am in favor of accepting this work i have read the responses of the authors i find that these additions are going to improve the paper i thus maintain my score a well written and novel work on a 
variational framework for player valuations in cooperative games. while approachability to practitioners could be improved, i am in favor of accepting this work docsepthis paper studies valuation problems from cooperative game theory. there are n agents and a valuation function f : 2^n -> r, where f(s) is the collective payoff of the coalition s subseteq n. the goal is to use this function f to define an importance vector phi(f) in r^n; examples include the shapley value and banzhaf index. the authors introduce a probabilistic treatment of this problem where they use f to define a probability distribution p, where p(s) is the probability that coalition s forms. they then phrase the problem of defining an importance vector phi(f) as a decoupling problem under p: the n agents may be correlated in a complicated way, but to assign each of them an individual importance value one must decouple their interactions or simplify their correlations. the goal is then to find a product distribution q that is as close to p as possible under the kl divergence. specifically, the authors define q to be a product of n independent bernoulli distributions, where the probability that agent i participates in the coalition is denoted x_i. the authors show how to optimize the probabilities x_1, ..., x_n using coordinate ascent. finally, they define the importance score of player i as log(x_i / (1 - x_i)) (ignoring a temperature t term for simplicity). the authors show that the resulting importance vector satisfies many of the gametheoretic axioms that the shapley value and banzhaf index satisfy, like the null player, marginalism and symmetric axioms. in the experiments the authors look at small instances with n <= 25 where it is actually possible to compute the gradients exactly (as opposed to an approximate sampling method). the applications they look at are for data valuation and feature attribution in the context of machine learning; for these tasks they show that their proposed approach performs about the same as the shapley value and banzhaf index, and sometimes a bit better. strengths: the authors show that one can derive the shapley value and banzhaf index using one step of gradient ascent using specific initializations; i think its nice that these two criteria can be unified under this model. the paper seemed polished and was easy to read. weaknesses: i think that the main way this paper could be improved is that i couldnt quite understand the benefit of this approach to defining the importance scores over classic approaches like the shapley value and the banzhaf index. its not any easier to compute and, at least according to whats written in the paper thus far, it doesnt seem to satisfy any additional gametheoretic axioms than the shapley value or banzhaf index. lastly, in the experiments it seems to perform about as well as these other two indices; the experiments dont seem to illustrate its superiority as decisively as one might hope. there are a few ways that the model choices could be further justified. the paper is really built around the specific choice p of the distribution over coalitions, which is justified in a few sentences in the third paragraph of the introduction. however, since the reader really has to buy in to this choice of p to appreciate the rest of the paper, i think that a bit more justification would be useful. one other thing i couldnt quite follow is why the importance score ends up being defined in terms of the inverse sigmoid (sigma^{-1}) function. smaller comments: 3rd paragraph of section 1: how is mu chosen, and why doesnt it show up in equation 1?
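the question about mu above has a direct numerical reading under the boltzmann distribution described in the earlier reviews: for the small games considered (n <= 25), one can enumerate all coalitions and recover the temperature whose mean coalition value matches a target mu. the sketch below is purely illustrative; the function names, the toy game, and the reliance on the mean value being monotone in t (the one-to-one correspondence claimed in the reviews) are assumptions of this example, not code from the paper.

import itertools
import math

def boltzmann(f, players, t):
    # exhaustive enumeration of all 2^n coalitions, feasible for small n
    coalitions = [frozenset(c) for r in range(len(players) + 1)
                  for c in itertools.combinations(players, r)]
    weights = [math.exp(f(s) / t) for s in coalitions]
    z = sum(weights)
    return coalitions, [w / z for w in weights]

def mean_value(f, players, t):
    coalitions, probs = boltzmann(f, players, t)
    return sum(p * f(s) for s, p in zip(coalitions, probs))

def temperature_for_mu(f, players, mu, lo=1e-3, hi=1e3, iters=60):
    # the mean coalition value shrinks toward the uniform average as t grows,
    # so bisection on log t recovers the temperature matching a target mu
    for _ in range(iters):
        mid = math.sqrt(lo * hi)
        if mean_value(f, players, mid) > mu:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

def toy_value(s):
    # hypothetical game with diminishing returns in coalition size
    return math.sqrt(len(s))

players = range(6)
t_star = temperature_for_mu(toy_value, players, mu=1.8)
print(t_star, mean_value(toy_value, players, t_star))

this is only meant to make the mu <-> t correspondence tangible; how the paper itself selects t (or mu) is exactly what the reviewer is asking.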
4th paragraph of section 1: at first i was confused about what the valuation problem is; then i realized its the problem of defining the importance vector. it would be helpful to clarify this. last paragraph of section 1: i have a few suggestions. i didnt know what was meant by a decoupling perspective until i got to section 4, so i wasnt able to understand this sentence. intriguing properties is a bit too vague in my opinion; after reading section 5 im not really sure what these intriguing properties would be, so it would definitely help to spell these out. overall, i think that this paper was nicely written and i like that it unifies the shapley value and banzhaf index, but i dont yet see why the proposed approach is necessarily better than using either of these two existing criteria, so im leaning towards rejection ### Summary:
this paper considers the valuation problem for a cooperative game and shows that some classical metrics (eg the shapley value) can be considered as approximations to the maximum entropy solution. reviewers were generally very positive; they especially praised the novelty and writing quality while having some concerns about the quality of the empirical results. the authors did an excellent job responding to the reviewers and resolved their main concerns. a few quibbles remain, however, and while the manuscript is very good as-is, please consider the reviewer criticisms in creating an updated version
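one concrete item raised in the reviews above was the missing derivation of the entropy gradient used in the update rule. under the factored bernoulli surrogate described by the reviewers it follows in two lines; the notation mirrors the reviews, not necessarily the papers own symbols:

H(q) = -\sum_{i=1}^{n} \left[ x_i \log x_i + (1 - x_i) \log (1 - x_i) \right]

\nabla_i H(q) = \frac{\partial H}{\partial x_i}
             = -\left[ \log x_i + 1 - \log (1 - x_i) - 1 \right]
             = \log \frac{1 - x_i}{x_i}

combined with the mean-field objective f_m(x)/t + H(q), setting the derivative with respect to x_i to zero gives log(x_i / (1 - x_i)) = nabla_i f_m(x) / t, ie x_i = sigma(nabla_i f_m(x) / t), which seems consistent with the reviewers reading of algorithm 1 as a fixed-point iteration rather than plain gradient ascent.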
[ 715, 5933, 337, 390, 320, 12956, 715, 1027, 33255, 323, 39793, 253, 27451, 23279, 891, 13414, 755, 436, 690, 37699, 327, 436, 1127, 651, 320, 9371, 50276, 14337, 273, 3276, 352, 1537, 320, 4409, 15806, 11120, 323, 16186, 84, 1638, 1010, 326, 253, 2173, 3276, 1318, 1057, 417, 2647, 285, 326, 352, 1057, 417, 2647, 323, 667, 1625, 46701, 554, 5731, 4390, 253, 9414, 1364, 4677, 436, 562, 323, 3746, 9255, 432, 326, 697, 247, 2372, 49770, 326, 1027, 9208, 4917, 1027, 39762, 14452, 285, 326, 359, 13414, 871, 1199, 670, 253, 3607, 273, 253, 1027, 5482, 533, 891, 9428, 697, 4030, 281, 3553, 2007, 5839, 281, 2852, 789, 352, 812, 671, 320, 5322, 281, 452, 2057, 247, 43302, 390, 4864, 30762, 2593, 4645, 2139, 246, 50276, 17, 285, 246, 281, 2192, 555, 10808, 1014, 5195, 285, 14805, 20552, 2975, 347, 436, 310, 671, 4390, 1669, 281, 253, 9414, 50276, 14337, 273, 3302, 6081, 275, 2593, 5976, 253, 373, 247, 4864, 2593, 16585, 253, 3302, 14460, 2554, 8772, 39762, 2193, 352, 3133, 6571, 987, 533, 516, 13477, 407, 253, 1750, 326, 253, 3302, 6081, 36908, 2647, 604, 368, 2098, 327, 3515, 305, 69, 281, 14940, 849, 476, 326, 320, 1677, 326, 253, 1895, 310, 1327, 44181, 4767, 4321, 275, 253, 2929, 841, 1646, 751, 34126, 5697, 4496, 19148, 604, 1896, 50276, 1911, 5714, 285, 6733, 3607, 275, 2593, 7127, 253, 373, 247, 12494, 16585, 2139, 39762, 2193, 13414, 10517, 253, 823, 5714, 285, 6733, 3607, 10048, 407, 439, 522, 2205, 2193, 891, 1119, 436, 12494, 247, 2372, 8909, 275, 15974, 436, 2139, 1953, 634, 8813, 12453, 2139, 597, 943, 2649, 4606, 2139, 841, 3607, 1537, 320, 440, 6243, 4052, 347, 604, 368, 574, 690, 4327, 275, 253, 2647, 2581, 685, 253, 15965, 1405, 2291, 2531, 4606, 2139, 597, 13414, 891, 651, 1642, 326, 368, 4575, 436, 12494, 281, 19148, 1880, 368, 250, 15571, 891, 2139, 1110, 3607, 403, 2649, 10048, 390, 21255, 2139, 697, 8261, 326, 597, 250, 417, 10048, 50276, 66, 4564, 1841, 670, 253, 4679, 50275, 249, 2593, 8676, 835, 359, 1007, 387, 253, 14940, 273, 5933, 337, 359, 476, 4518, 923, 326, 352, 26414, 533, 1057, 352, 29623, 281, 253, 1072, 1127, 10159, 273, 253, 31850, 436, 778, 320, 4409, 2819, 715, 1955, 253, 3237, 1327, 44181, 414, 50276, 261, 352, 4409, 2819, 715, 1880, 5933, 337, 476, 4917, 5919, 1698, 87, 14417, 439, 522, 2205, 5568, 20122, 2320, 1318, 8197, 4103, 281, 5368, 48489, 513, 368, 452, 667, 30328, 670, 849, 253, 11041, 1537, 7277, 323, 247, 4229, 1180, 273, 2165, 27163, 50276, 9088, 497, 247, 4564, 2173, 25491, 326, 891, 1869, 812, 320, 5520, 50275, 783, 10199, 2296, 326, 368, 8338, 247, 37851, 1971, 273, 3958, 28763, 2032, 533, 697, 417, 1077, 2173, 984, 27293, 2165, 26850, 403, 4536, 37851, 253, 373, 789, 7296, 19191, 27293, 3958, 285, 439, 522, 2205, 5568, 20122, 2320, 2193, 452, 37851, 26850, 352, 1537, 320, 1805, 281, 1333, 326, 368, 12661, 4715, 247, 2803, 1025, 3268, 689, 3773, 281, 12666, 387, 4760, 821, 12542, 984, 28763, 47515, 4451, 1060, 253, 1072, 12494, 2296, 1633, 751, 436, 1996, 533, 352, 6505, 562, 253, 2372, 670, 4715, 247, 2803, 1025, 35701, 3268, 285, 253, 958, 326, 253, 3236, 3268, 310, 14659, 281, 1691, 625, 2280, 327, 3773, 326, 8162, 625, 1318, 50276, 12563, 275, 253, 10199, 368, 1375, 326, 368, 2589, 4715, 285, 11649, 1783, 275, 247, 27998, 17699, 16561, 5133, 516, 417, 2119, 436, 310, 3451, 634, 1332, 273, 2282, 1057, 2177, 533, 417, 11649, 1783, 390, 17699, 16561, 17032, 488, 373, 253, 2720, 47515, 253, 941, 436, 2929, 23970, 690, 1077, 4722, 5697, 670, 253, 897, 273, 38391, 983, 285, 1599, 3423, 2177, 275, 253, 3634, 273, 4760, 29581, 323, 27293, 3958, 
616, 747, 8668, 310, 4802, 281, 439, 522, 2205, 5568, 20122, 2320, 22275, 5280, 2531, 2193, 352, 16000, 16851, 13757, 253, 27935, 2430, 10491, 285, 597, 4853, 247, 747, 1318, 253, 39762, 3605, 326, 17923, 3240, 973, 275, 616, 4679, 50276, 74, 574, 247, 4564, 3533, 285, 5701, 670, 253, 4028, 533, 891, 1902, 841, 588, 320, 3477, 281, 2953, 5474, 339, 431, 248, 2929, 2175, 29581, 3237, 323, 27293, 3958, 352, 29328, 247, 747, 29581, 2557, 1925, 39762, 3605, 253, 2934, 310, 281, 2794, 247, 20706, 5912, 3268, 1754, 327, 247, 4869, 15579, 17705, 4760, 821, 12542, 403, 840, 6012, 407, 6153, 34430, 6216, 919, 6375, 684, 273, 436, 3268, 253, 4477, 840, 1246, 247, 11786, 49104, 5933, 281, 11897, 436, 34430, 4906, 8946, 29581, 6866, 751, 253, 439, 522, 2205, 1318, 285, 253, 8913, 20122, 2320, 3605, 476, 320, 12372, 347, 2714, 2219, 390, 14586, 273, 253, 11333, 10040, 684, 5001, 20544, 281, 253, 1682, 273, 619, 3640, 253, 4081, 29581, 273, 253, 7792, 310, 4460, 352, 310, 973, 17194, 285, 253, 10291, 342, 5368, 8946, 3082, 403, 1077, 4722, 352, 671, 13279, 253, 3369, 323, 2007, 18149, 347, 1027, 35701, 3210, 390, 2898, 2173, 2235, 641, 476, 320, 4354, 11217, 1223, 352, 310, 1892, 281, 9059, 1097, 275, 3762, 285, 275, 3946, 326, 581, 29581, 1332, 310, 1805, 685, 253, 18075, 253, 16774, 1543, 1646, 281, 320, 5272, 891, 2868, 326, 436, 2929, 476, 452, 1534, 3486, 275, 253, 2170, 275, 253, 8993, 2852, 50276, 1747, 13218, 32213, 253, 2929, 812, 320, 5520, 275, 2426, 273, 2746, 1430, 281, 24432, 41005, 9610, 253, 1408, 2069, 273, 253, 4679, 285, 263, 253, 3933, 317, 1767, 553, 5454, 14273, 323, 278, 3591, 68, 3082, 651, 320, 4217, 323, 24432, 23000, 667, 7535, 327, 849, 4212, 943, 4665, 253, 7880, 7363, 273, 253, 39762, 3605, 3340, 1580, 246, 11852, 616, 13642, 812, 320, 4217, 436, 310, 3340, 2032, 323, 2219, 835, 253, 31972, 273, 1027, 29581, 3082, 403, 2074, 533, 253, 7880, 7363, 403, 417, 25761, 253, 12494, 327, 2139, 823, 5714, 390, 6733, 1057, 417, 1056, 3282, 323, 690, 3958, 310, 1077, 1774, 323, 24432, 281, 2096, 24432, 778, 320, 4354, 10480, 264, 715, 4680, 326, 253, 625, 3607, 247, 29581, 2557, 12310, 253, 1805, 10159, 273, 253, 2165, 597, 1611, 281, 2096, 987, 1024, 253, 12494, 310, 247, 2372, 14086, 285, 1892, 281, 956, 323, 23886, 417, 7615, 342, 2720, 789, 327, 26373, 16692, 1320, 273, 821, 12542, 275, 27293, 3958, 20953, 10704, 6667, 281, 7568, 2139, 823, 5714, 390, 6733, 476, 906, 275, 25962, 2338, 3870, 302, 316, 4085, 821, 12542, 812, 671, 1361, 24432, 15909, 752, 310, 253, 1895, 50276, 20513, 32213, 403, 5884, 594, 891, 717, 275, 3718, 273, 18738, 436, 789, 891, 452, 1239, 253, 6128, 273, 253, 4477, 891, 1089, 326, 841, 30733, 403, 1469, 281, 3157, 253, 2929, 891, 3021, 6558, 619, 4868, 247, 973, 3542, 285, 4460, 789, 327, 247, 39762, 7792, 323, 4760, 821, 12542, 275, 27293, 3958, 1223, 2746, 1430, 281, 24432, 812, 320, 5520, 891, 717, 275, 3718, 273, 18738, 436, 789, 5474, 33032, 2520, 2929, 2175, 29581, 3237, 432, 27293, 2165, 3762, 627, 403, 295, 6083, 285, 247, 29581, 1159, 269, 295, 281, 391, 835, 25290, 310, 253, 12786, 2075, 2727, 273, 253, 20706, 256, 8578, 2574, 295, 253, 4736, 310, 281, 897, 436, 1159, 269, 281, 4853, 271, 6349, 4972, 815, 338, 275, 391, 79, 6667, 2486, 253, 439, 522, 2205, 1318, 285, 8913, 20122, 2320, 3605, 50276, 783, 4477, 9569, 247, 37851, 1971, 273, 436, 1895, 835, 597, 897, 269, 281, 4853, 247, 5912, 3268, 268, 835, 3714, 310, 253, 5912, 326, 20706, 256, 4948, 597, 840, 12616, 253, 1895, 273, 13947, 271, 6349, 4972, 815, 338, 347, 247, 34430, 4906, 1895, 
762, 268, 253, 295, 6083, 778, 320, 9578, 275, 247, 9542, 1039, 533, 281, 9212, 1016, 273, 731, 271, 2060, 6349, 1318, 581, 1364, 34430, 713, 616, 6355, 390, 25636, 616, 13007, 253, 4736, 310, 840, 281, 1089, 247, 1885, 3268, 2805, 326, 310, 347, 2810, 281, 268, 347, 1896, 762, 253, 27451, 23279, 5742, 253, 4477, 4853, 2805, 281, 320, 271, 295, 3907, 270, 1808, 276, 25658, 3268, 835, 253, 5912, 326, 5570, 891, 45347, 275, 253, 20706, 310, 17007, 1269, 74, 253, 4477, 921, 849, 281, 22318, 253, 20552, 1269, 18, 20200, 1269, 79, 970, 13249, 49104, 4720, 597, 4853, 253, 6349, 4868, 273, 4760, 891, 347, 2412, 2981, 18, 2981, 23111, 247, 3276, 246, 1307, 323, 17647, 253, 4477, 921, 326, 253, 4795, 6349, 4972, 12310, 1142, 273, 253, 18814, 10666, 30325, 26373, 3056, 326, 253, 439, 522, 2205, 1318, 285, 8913, 20122, 2320, 3605, 10517, 751, 253, 3635, 4760, 16888, 1204, 285, 13123, 26373, 3056, 50276, 249, 253, 4679, 253, 4477, 1007, 387, 1355, 10872, 342, 295, 50276, 1099, 835, 352, 310, 2686, 1896, 281, 11897, 253, 27935, 4555, 347, 10066, 281, 271, 16851, 10491, 1332, 253, 4893, 597, 1007, 387, 403, 323, 941, 29581, 285, 4735, 863, 2382, 275, 253, 3634, 273, 5145, 4715, 323, 841, 8892, 597, 921, 326, 616, 4081, 2746, 17923, 670, 253, 1072, 347, 253, 439, 522, 2205, 1318, 285, 8913, 20122, 2320, 3605, 285, 4536, 247, 2372, 1805, 20544, 50276, 783, 4477, 921, 326, 581, 476, 15313, 253, 439, 522, 2205, 1318, 285, 8913, 20122, 2320, 3605, 970, 581, 3213, 273, 11786, 49104, 970, 2173, 3302, 5904, 891, 1158, 697, 5322, 326, 841, 767, 6866, 476, 320, 27998, 762, 436, 1566, 50276, 783, 2929, 4455, 29422, 285, 369, 3477, 281, 1239, 50276, 20881, 1255, 265, 50276, 74, 1158, 326, 253, 2022, 1039, 436, 2929, 812, 320, 5520, 310, 326, 891, 812, 2649, 3240, 2096, 253, 5649, 273, 436, 2746, 281, 13947, 253, 6349, 7363, 689, 10610, 7274, 751, 253, 439, 522, 2205, 1318, 285, 253, 8913, 20122, 2320, 3605, 697, 417, 667, 6927, 281, 11897, 285, 387, 1878, 2556, 281, 47515, 3542, 275, 253, 2929, 3021, 2080, 352, 36908, 1646, 281, 10517, 667, 3081, 18814, 10666, 30325, 26373, 3056, 685, 253, 439, 522, 2205, 1318, 390, 8913, 20122, 2320, 3605, 1390, 314, 275, 253, 4679, 352, 3133, 281, 1347, 670, 347, 973, 347, 841, 643, 767, 14452, 253, 4679, 13414, 1646, 17093, 697, 34385, 347, 20939, 1242, 347, 581, 1537, 3524, 50276, 9088, 403, 247, 1643, 4088, 326, 253, 1566, 10165, 812, 320, 2007, 17285, 253, 2929, 310, 1663, 4270, 1475, 253, 2173, 4327, 268, 273, 253, 3268, 689, 10089, 4431, 534, 310, 17285, 275, 247, 1643, 14683, 275, 253, 2626, 12494, 273, 253, 10199, 2299, 1580, 253, 9414, 1663, 556, 281, 4489, 275, 281, 436, 4327, 273, 268, 281, 11435, 253, 1551, 273, 253, 2929, 891, 1158, 326, 247, 2372, 625, 22861, 651, 320, 4217, 581, 643, 2181, 891, 812, 2649, 3240, 956, 310, 2139, 253, 6349, 4868, 7637, 598, 1146, 2931, 275, 2426, 273, 253, 13737, 40009, 3470, 50276, 6795, 254, 5701, 50276, 20, 5784, 12494, 273, 2593, 337, 849, 310, 12910, 6777, 2139, 36908, 352, 921, 598, 275, 5150, 337, 50276, 21, 394, 12494, 273, 2593, 337, 387, 806, 891, 369, 13477, 670, 752, 253, 29581, 1895, 310, 840, 891, 8156, 697, 253, 1895, 273, 13947, 253, 6349, 4972, 352, 651, 320, 9371, 281, 19148, 436, 50276, 6275, 12494, 273, 2593, 337, 891, 452, 247, 1643, 13991, 50273, 74, 42126, 871, 752, 369, 5486, 407, 247, 34430, 4906, 8668, 1919, 891, 1694, 281, 2593, 577, 594, 891, 369, 2649, 2104, 281, 2096, 436, 6197, 50273, 565, 10389, 5845, 3607, 310, 247, 2372, 1512, 21248, 275, 619, 4743, 846, 4361, 2593, 608, 516, 417, 1663, 2119, 752, 
841, 27807, 3607, 651, 320, 594, 352, 651, 7964, 1361, 281, 15368, 841, 562, 50276, 1189, 455, 891, 1158, 326, 436, 2929, 369, 23395, 3542, 285, 891, 751, 326, 352, 440, 7790, 253, 439, 522, 2205, 1318, 285, 8913, 20122, 2320, 3605, 533, 891, 13414, 2568, 923, 2139, 253, 4081, 2746, 310, 7933, 1805, 685, 970, 2057, 273, 841, 767, 5368, 6866, 594, 516, 25661, 4404, 18235, 2490, 187, 4118, 18435, 27, 2520, 2929, 19401, 253, 29581, 1895, 323, 247, 27293, 2165, 285, 2722, 326, 690, 8946, 17082, 24088, 439, 522, 2205, 1318, 476, 320, 2783, 347, 34754, 281, 253, 4869, 15579, 50276, 15337, 398, 497, 3839, 1077, 2762, 597, 3340, 26108, 253, 38135, 285, 4028, 3290, 1223, 1907, 690, 7350, 670, 253, 3290, 273, 253, 16774, 1543, 253, 4477, 858, 271, 7126, 2628, 19392, 281, 253, 30628, 285, 11512, 616, 2022, 7350, 247, 1643, 572, 487, 9143, 3464, 2299, 285, 1223, 253, 7714, 310, 1077, 1175, 347, 261, 4496, 1908, 253, 37317, 43680, 275, 6153, 271, 9300, 2715 ]
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: the paper proposes to use a multitask rl policy trained offline with (i) data (state-action-state-reward tuples) for a collection of tasks as a starting point to finetune online 2 new policies for a new similar task one policy to solve the task itself and one policy to autonomously reset the environment in case the new task is not sufficiently similar to the tasks for which the multitask policy was pretrained (ii) additional data from operatorguided demonstrations are also used the approach is demonstrated both in realworld experiments and simulation for 6 pickandplace tasks with a robotic manipulator the results in realworld experiments show how online finetuning improves the success rate of the multitask policy on new tasks the paper interestingly brings together 3 ideas offline learning multitask transfer learning and learned automated reset policies into a single approach the paper makes simultaneous use of three very interesting and promising methods offline learning multitask transfer learning and learned automated reset policies and treats the onehot task representation of the multitask policy as a continuous learnable parameter that also represents the task embedding of new tasks and the reset policy although not entirely new is a neat approach to address the papers problem besides the marginalincremental novelty one possible weakness of the paper is that it seems that the new tasks need to be similar enough that the old multitask policy can already achieve success the role of human demonstrations initially presented as optional then used for socalled outofdistribution new tasks should be better introduced and more formally justified the way data and results are presented in the tables should be improved for clarity and standalone readability docseprealworld robotic reinforcement learning rl typically requires timeconsuming data collection and frequent human intervention to reset the environment the offline rl approach provides a way to utilize previously collected data to bootstrap the learning process so that a new task can be learned with a small number of interactions in this work the authors study a problem where such offline data is available from several tasks along with their task labels the data may be collected with optimal or suboptimal policies the authors propose a complete system for extracting useful skills from prior data and applying them to learn new tasks autonomously when faced with a new task the proposed system adapts previously learned skills to quickly learn to both perform the new task and return the environment to an initial state effectively performing its own environment reset the authors show the effectiveness of their system with extensive physical and simulated experiments strength the paper addresses an important problem of data efficiency and resetfree learning for real robotic applications the paper clearly mentions the prior works on which the proposed system is based extensive experimentation to assess the effectiveness of the system both in simulated and physical environments weakness the main weakness of the paper is that it is a bit difficult to see what the actual contribution of the paper is and how different the approach is compared to the previously published work several prior works already demonstrated how online rl can be accelerated with prior offline data docsepthe paper discusses a system to leverage prior data from related
but distinct tasks to increase learning efficiency and generalization in a new task the system works in two steps 1 using offline data consisting of different tasks with task ids to learn a taskconditional policy using offline rl 2 optimize the taskconditioning embedding during deployment for the new task to reduce the human supervision involved in resets the method proposes to extract two different embeddings one that accomplishes the task and the other that resets the task the experiments are shown on both a real world robot and in simulation strengths 1 the paper presents a relatively novel combination of existing known modules for learning with offline data and learning reset policies 2 the paper evaluates the results on real world and simulated domains and provides comprehensive results to show the validity of the method the experiments demonstrate the superior generalization ability of the system arising as a result of learning from multiple tasks and increased learning efficiency weaknesses 1 limited baselines the paper evaluates ariel on the real world tasks but does not present a comparison with any baseline potential baselines to be compared to are 229313233 references from paper and common rl baselines which are more suited to sparse rewards like her (see references 1 to 3 below) 2 the experiments use a short horizon of m=20 which is small for a number of locomotion and navigation tasks it is not clear how this method would scale with horizon since the startstate distribution for learning the reset policy can grow exponentially 1 hindsight experience replay marcin andrychowicz et al 2 replacing rewards with examples examplebased policy search via recursive classification eysenbach et al 3 c-learning learning to achieve goals via recursive classification eysenbach et al docseppretraining with large offline data and finetuning on a small amount of online data is a sampleefficient way of learning a new task however most recent approaches require explicit environment resetting for online data collection this work proposes to minimize manual resetting of the environment when learning a new task in the real world leveraging offline multitask data first from the offline multitask data the proposed method ariel learns a multitask policy conditioned on a task embedding $\mathbf{z}$ using offline rl then during the online finetuning phase ariel searches for a task embedding for the forward task as well as another task embedding for the backward task (environment reset) ariel iterates the forward and backward tasks which ideally resets the environment and performs the target task without human intervention the realworld experiments show that ariel achieves more than 70% success rates with only 5 to 30 manual resets strengths the idea of utilizing a multitask policy for both forward and backward reset policies makes sense and the paper empirically shows its effectiveness the realworld experiments are exhaustively conducted with many different objects and in three different environment setups weaknesses in rl covering diverse initial states is important for robustness and generalization of the learned policy however without explicit randomized resetting the proposed backward controller may learn to reset to a few easy to reset initial states which maximizes its own reward it would be great if the authors discuss this potential issue the proposed alternating scheme is appealing in automating online reinforcement learning with less human intervention however one of the critical assumptions in this work is that both
forward and backward controllers can be achieved with a few random shootings the reviewer is wondering whether this is a practical assumption for robot learning to show the sufficient difference between training tasks and testing tasks another interesting point could be freezing the policy after pretraining and only finding an appropriate $\mathbf{z}$ using cem if this works as well as the proposed method it could mean the prior data is too similar to the testing tasks which means the experimental setup might be too easy the proposed method assumes a task embedding corresponds to a task however in many cases the offline data trajectories can be a compositional combination of multiple behaviors and the target task may require stitching of multiple of these trajectories the proposed method does not seem trivial to scale to this common scenario questions and suggestions figure 2b is too small compact and not descriptive without enough explanation it would be better to include only two examples one from indistribution and another one from outofdistribution tasks and provide sufficient descriptions l242243 placing the ring on the peg does not seem to require greater precision than the prior tasks in l244245 while the paper claims that the proposed method can utilize data from any reasonable source the proposed method requires reward annotation which is not available in rewardfree demonstrations eg play data multitask expert demonstrations given that the proposed method can potentially work with sparselylabeled data using offline rl it may not be a big issue but it should be clarified in l253 what does it mean that rewards are specified at the end of the trajectory the reviewer thinks rewards for the target demonstrations should be all 1 at the end or does it mean that the reward is labeled depending on the forward or backward tasks what is the definition of trial in sections 5 and 6 does this mean m steps or 2m steps in table 1 why do indistribution tasks perform worse than outofdistribution tasks in table 1 including a baseline with episodic reset as an upper bound can be helpful to understand how much reset information is important for learning these tasks in table 1 and table 2 which experiment uses 40 downstream task demonstrations this is not clear the explanation in l291l293 about the experiments for table 2 is a bit confusing are the testing tasks unseen during online finetuning more specifically are put x in container tasks evaluated using a policy pretrained from multitask prior data and finetuned for put tiger on lid in table 2 how many trials were used for training single task data only in a21 there are three different environments tray container tray drawer and kitchen for offline training of ariel are data from all three environments used together or is data from the same environment as the target task used for offline training in algorithm 1 steps should be reset to 0 and a new $\mathbf{z}$ should be sampled like l9 when the task direction d is reversed in l14 in algorithm 1 l2 what does it mean to fit $q_f(\mathbf{z})$ and $q_b(\mathbf{z})$ to offline task indices in algorithm 1 l5 d should also be reset to f as the environment state is reset simply moving l3 to l5 can work how much randomness is there in object poses and robot poses for each trial of evaluation in tables 1 and 2 in table 1 how is offline only evaluated more specifically which $\mathbf{z}$ is used for this experiment and how can outofdistribution tasks have such high success rates with offline only a bit more detailed description of the task indices
$\mathbf{z}$ can be helpful to understand the proposed method the paper mentions that $\mathbf{z}$ is a task embedding from hausman et al in l159 but more details about how it is trained could be elaborated in the paper it could be interesting to see how the success rates of the forward and backward controllers evolve over the online training many references are from arxiv please cite their conference or journal versions ### Summary:
please check the comments of the reviewers in detail. strengths: interesting combination of existing ideas; experiments with a real system (lowcost arm). weaknesses: not highly novel (combination of existing ideas); comparison with the stateoftheart could be stronger (using prior data has been explored several times in the past); statistics need to be clarified (number of replicates, including for the realworld experiments, evaluation of variance, etc); assumption that both forward and backward policies can be learned with a few random shoots, which seems unlikely in most scenarios. postrebuttal remarks: i would like to thank the authors for their efforts in answering the comments. the main concern left is the novelty / potential impact, but overall the reviewers liked the paper
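the reviews above repeatedly describe ariel's loop of searching for a task embedding, running the frozen multitask policy forward to attempt the task, and running it backward to reset the environment. the following is a purely illustrative toy sketch of that alternation; the environment, rewards, dimensions and function names (multitask_policy, rollout, cem_search) are assumptions made for this sketch and do not come from the paper.

```python
# Toy sketch of an ARIEL-style forward/backward alternation with a
# cross-entropy search over task embeddings. Everything here (dynamics,
# rewards, names) is assumed for illustration only.
import numpy as np

rng = np.random.default_rng(0)
GOAL = np.array([1.0, 1.0, 0.0, 0.0])   # "task done" state (toy stand-in)
HOME = np.zeros(4)                       # reset / initial state (toy stand-in)

def multitask_policy(state, z):
    # stand-in for the frozen pretrained multitask policy conditioned on z
    return np.tanh(state[:2] + z[:2])

def rollout(state, z, target, horizon=20):
    # run the frozen policy; reward here is negative distance to the target state
    total, s = 0.0, np.array(state, dtype=float)
    for _ in range(horizon):
        a = multitask_policy(s, z)
        s = s + 0.1 * np.concatenate([a, -a])       # toy dynamics
        total -= float(np.linalg.norm(s - target))  # toy reward
    return total, s

def cem_search(state, target, dim=4, iters=5, pop=32, elite=8):
    # cross-entropy search for a task embedding that scores well from `state`
    mean, std = np.zeros(dim), np.ones(dim)
    for _ in range(iters):
        cands = rng.normal(mean, std, size=(pop, dim))
        scores = np.array([rollout(state, z, target)[0] for z in cands])
        top = cands[np.argsort(scores)[-elite:]]
        mean, std = top.mean(axis=0), top.std(axis=0) + 1e-3
    return mean

state = rng.normal(size=4)
for trial in range(3):
    for direction, target in (("forward", GOAL), ("backward", HOME)):
        z = cem_search(state, target)         # embedding for this direction
        _, state = rollout(state, z, target)  # the backward rollout acts as the reset
        print(trial, direction, np.round(state, 2))
```

in the actual system the forward and backward phases would use task-specific rewards and a learned policy rather than the toy dynamics above; the sketch only illustrates why no manual reset is needed once both embeddings are found.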
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: this paper aims to solve unstructured pruning as a bilevel optimization problem to find pruning masks and values of unpruned weights they define bilevel equations and perform two sgd processes iteratively their goal is proposing a new method that can show comparable accuracy to imp (iterative magnitudebased pruning) with restricted training time strengths for unstructured pruning of dnns this paper defines a bilevel optimization problem and shows theoretical contents for that this paper shows comparable results of this method using various datasets and architectures the compression including retraining time is not increased with various pruning rates weaknesses i have a concern about the real effectiveness of unstructured pruning as many papers have mentioned there is no realistic acceleration method for unstructured pruning due to the random locations of unpruned weights there is no note on this widelyknown problem even though this method is a novel idea for unstructured pruning we cannot use this method for our inference when using csr formats ex cusparse library in cuda we need much higher pruning rates to gain faster inference speed i think that problem is why there have been only a few studies on unstructured pruning these days unlike structured pruning in this aspect this method can achieve outperforming results at lower pruning rates less than 50% but there is less noticeable improvement at effective pruning rates so in my opinion the resulting accuracy seems not to be the main contribution of this paper moreover the experimental results seem to be quite restricted we are living in 2022 and neural networks have evolved since song hans magnitudebased pruning i dont think the results on the cifar10 dataset can prove the novelty of a pruning method it can just show a pruning method works well most of the results in this paper are limited to the resnet arch and cifar10 dataset cifar10 consists of just ten classes i think this paper should extend the experimental results to various architectures ex mobilenet transformer or a bigger dataset with a smaller model ex resnet18 on imagenet then there remains another main contribution faster pruningretraining speed regardless of pruning rate to strengthen this contribution i think there should be more ablation studies and experiments im curious why this method works with consistent time regardless of pruning rates or if i put in more time for higher pruning rates then can i gather higher accuracy by using this method this paper argues for an unstructured pruning method but there is a critical issue with unstructured pruning i think this paper has little consideration of that the experimental designs are quite restricted and should be extended to other challenging datasets and architectures docsepsearch for the winning lottery in the lottery ticket hypothesis lth is of great interest in the machine learning ml community and this paper aims to find such winning tickets in most cases this work formulates model pruning (primarily unstructured but also extended to structured pruning) as a bilevel optimization blo where the lowerlevel optimization finds the best possible set of weights given the sparse neural network (that is the weight masks) and the upperlevel optimization optimizes the boolean mask (done using continuous relaxation followed by thresholdbased rounding)
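the lowerlevel retraining and upperlevel mask search that the reviewers describe can be summarised in generic bilevel-pruning notation as below; here $m$ is the binary mask, $\theta$ the weights, $\ell$ the training loss, $n$ the number of parameters and $k$ the number of weights kept. this is an illustrative template of the formulation the reviews refer to, not the paper's exact objective or constraint set.

```latex
\begin{aligned}
\min_{m \in \{0,1\}^{n},\ \|m\|_{0} \le k} \quad & \ell\bigl(m \odot \theta^{*}(m)\bigr) \\
\text{s.t.} \quad & \theta^{*}(m) \in \arg\min_{\theta}\ \ell\bigl(m \odot \theta\bigr)
\end{aligned}
```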
the authors first derive the general expression for the gradient with respect to the mask using an implicit gradient which involves secondorder derivatives a matrix inverse and n x n matrices where n is the number of parameters in the network however with the hessian-free assumption and given the nature of the problem bilinear in the mask m and the parameters theta the final expression of the gradient turns out to have only firstorder derivatives having defined the expressions they finally proceed to describe the algorithm which involves lowerlevel sgd and upperlevel spgd done until convergence which does happen in practice (a minimal sketch of this alternation is included below) then the paper proceeds to the experiments involving multiple architectures multiple datasets and different pruning ratios performance metrics involve final accuracy performance relative to the dense pretrained model and overall run time strong experiments show that bip achieves superior performance to the original dense model ie finds a winning lottery ticket and has comparable or superior performance to iterative magnitude pruning imp while having much lower runtime similar to oneshot pruning methods strengths 1 the paper is concise and well written 2 the theory is easy to follow and the experiment section is strong involving ablations across different hyperparameters 3 charts are well organized overall i enjoyed reading this work ive some suggestions which primarily are additional references to be discussed and some additional visualization that could help to make this a strong submission which i describe in the following weakness section weaknesses 1 while the diagram is mentioned in the appendix can the authors include an algorithm block for a better understanding of the overall flow 2 how are the diverse batches chosen for the training does it involve some kind of submodular optimization to get the schedule what happens if the batch for sgd is qualitatively different from the batch for spgd 3 the convergence result shown in the appendix involves final accuracy does it happen that the masks still keep changing that is the subnetwork found keeps changing but all of those subnetworks have similar performance when trained with sgd 4 it was mentioned that sgd steps are kept fixed to 1 however at the very first step of pruning why would having just 1 sgd step suffice 5 table 1 can the authors add a vertical separator line for the datasets additional references that can be discussed 1 soft threshold weight reparameterization for learnable sparsity icml20 2 effective training of sparse neural networks under global sparsity constraint cvpr21 3 rethinking bilevel optimization in neural architecture search a gibbs sampling perspective aaai21 as mentioned in the main review docsepthis paper provides a novel reformulation of model pruning as a bilevel optimization blo problem in which the paradigm of pruning-retraining-pruning can be viewed as two optimization levels 1 finding the pruning mask the upperlevel and 2 masked model retraining the lowerlevel the paper further proposes an algorithm bilevel pruning bip to be a blo solver that uses only the firstorder gradient which makes it as efficient as oneshot pruning the experiment results show that bip enjoys the high efficiency of oneshot pruning and maintains the high accuracy of iterative magnitude pruning imp in both structured and unstructured pruning schemes strengths the idea of the blo reformulation of the model pruning problem is new which provides a theoretical basis for blo algorithms to be explored for model pruning the proposed bip pruning algorithm is original and practical the paper is well organized and easy to follow
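as a purely illustrative companion to the alternation summarised in the reviews above (one lower-level sgd step on the unpruned weights, then one upper-level projected step on a relaxed mask), here is a minimal first-order sketch on a toy least-squares problem. the learning rates, the soft-mask relaxation and the hard top-k projection are assumptions made for this sketch, not the authors' exact recipe.

```python
# Toy first-order alternating loop in the spirit of bilevel pruning:
# with effective weights w = m * theta, dL/dtheta = m * dL/dw and
# dL/dm = theta * dL/dw, so both updates need only first-order gradients.
import numpy as np

rng = np.random.default_rng(0)
n, sparsity = 64, 0.5
keep = int((1 - sparsity) * n)
X = rng.normal(size=(256, n))
y = X @ rng.normal(size=n)
theta = rng.normal(size=n)          # unpruned weight values (lower level)
m = np.full(n, 0.5)                 # relaxed mask scores in [0, 1] (upper level)

def grad_effective(theta, m):
    # gradient of 0.5 * mean squared error w.r.t. the effective weights m * theta
    return X.T @ (X @ (m * theta) - y) / len(y)

def project(scores, keep):
    # hard projection: keep the `keep` largest scores, zero the rest, clip to [0, 1]
    out = np.clip(scores, 0.0, 1.0)
    kept = np.argsort(out)[-keep:]
    proj = np.zeros_like(out)
    proj[kept] = out[kept]
    return proj

for step in range(300):
    theta -= 0.1 * (m * grad_effective(theta, m))                      # lower level
    m = project(m - 0.05 * (theta * grad_effective(theta, m)), keep)   # upper level

mask = (m > 0).astype(float)
err = 0.5 * np.mean((X @ (mask * theta) - y) ** 2)
print(f"kept {int(mask.sum())}/{n} weights, loss {err:.4f}")
```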
really helpful to add some discussion about the similarities and differences between the blo and the l0based pruning httpsarxivorgpdf171201312pdf l0based methods view pruning masks as random variables defined by specific parameters thus making the pruning masks could be learned together with the models this is a different perspective compared to blo and these two views may be combined and unified which makes it worth having a discussion and comparison the authors have addressed the limitations adequately it would be better to move some ablation studies like figure a4 to the main manuscript though the pages may be limited docsepthe authors correctly identify that iterative magnitude pruning imp is inefficient to extract winning tickets from neural networks to this end they investigate a method utilising bilevel optimisation blo to extract winning tickets including structured results which can easily yield realworld speedups in the paper they formulate pruning as a blo problem and subsequently evaluate across cifar10100 tinyimagenet and imagenet the results are consistently better than imp and gradientbased saliency methods such as grasp see below for response to authors score has been updated after rebuttal i am suspicious of the contributions of this paper since the use of winning ticket seems to be overloaded in this work in the abstract this term is defined as ie pruned sparse models with better generalization than the original dense models but this is not the widely agreed upon definition lines 35 to 37 give the more widely accepted definition but line 325 gives a different definition which i dont agree with it is crucial that the work is selfconsistent and consistent with other literature following on from this observation it seems that the pruning is done throughout training this cannot reasonably be compared to imp or grasp the pruning mask is set at iteration 0 or close to 0 and does not change throughout training it is also worth noting that grasp is not sota anymore you should compare to prospr by alizadeh et al cited by your work already the only valid comparison i can see would be to the earlybird work by you et al but you dont provide any in this direction in addition you have provided few comparisons to methods which prune after training and perhaps do some small amount of fine tuning these are also reasonable competitors to this work finally you are not the first to do bilevel optimisation for pruning differentiable network pruning for microcontrollers by liberis and lane would be an example prior art and im sure there are others here are some more minor thoughts code is included which is very nice since the method is fairly complicated i really commend this work for paying attention to structured pruning its a really hard problem and far more relevant than most other pruning directions but i am not sure you characterise the related work fairly single shot structured pruning by van amersfoort et al and prospr by alizadeh et al both assessed this direction at initialisation on l229 you say that we can assume nabla2 l 0 i am not convinced this is actually true in practice after all this is how many earlier pruning saliency methods worked however i am not saying that this makes the work incorrect approximations pop up everywhere in dl is there a good reason to use reference 18 as the benchmark rather than the benchmark provided by frankles missing the mark work it is very extensive and other works have built upon it in recent years figure 6a is not enough to justify the 
robustness to rewinding that marks a winning ticket youve provided results on the smallest model with the easiest dataset more difficult examples are needed the final line of the conclusion is a bit concerning in practice structured pruning methods could prune the same number of parameters but yield totally different speedups it is an important thing to measure i dont think there is any explicit discussion of limitations in the main paper the checklist points us to appendix c4 but theres nothing there regarding limitations and i have discussed some above ### Summary:
the reviewers had significantly diverging opinions on this manuscript the main issue under discussion was whether the framing of this paper as a lottery ticket work was correct given that the main evaluations use no reinitialization or rewinding on balance i think that while one reviewer was very negative about the paper the disagreement was mostly terminological the substantial concern is whether the evaluation comparison wherein blo with no rewinding is compared against methods that use rewinding in figure 3 is fair the authors respond to this by providing comparisons in figures 6a and a11 that evaluate rewinding on some tasks however these figures seem to show that the accuracy of blo is completely insensitive to rewindingand even to complete reinitialization in the original lotteryticket sense this raises the natural question why not just evaluate primarily in the reinitialized case where theres no need to redefine the term winning ticket that is the whole presentation seems to be backwards the way it should be presentedevaluated is first we show that blo outperforms other methods in the classic lottery ticket regime where we reset all the weights to initialization 100 rewind this would replace the present figure 3 this would be a fair comparison comparing classicallotteryticketsetting pruning methods to each other next we show that one advantage of blo is that unlike other winningticketfinding methods its performance is invariant to rewinding that is if what we want is just accuracy of the pruned model and not to do some sort of scientific investigation of the lottery ticket hypothesis then blo outperforms other methods when we dont do any rewinding at all with this sort of presentation i think the authors could have avoided the negative reviewers objections despite these presentationalterminological issues i think that theres enough technical contribution here with figures 6a and a11 to move forward with acceptance especially considering the enthusiasm of the other reviewers and the technical novelty of the bilevel approach the empirical results are there and figures 6a and a11 show a clear connection with the lottery ticket work theyre just presented strangely and i think there are not any fundamental technical issues here that forbid acceptance
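The reviews above describe the same bilevel view of pruning but never write it down. A minimal sketch of how that formulation is usually stated follows; the notation (mask m, weights theta, training loss L, sparsity budget k) is my own shorthand and is not taken from the submission, so treat it as an illustration rather than the authors' exact objective.

$$
\min_{m \in \{0,1\}^{n},\ \|m\|_{0}\le k}\; \mathcal{L}\big(m \odot \theta^{*}(m)\big)
\qquad \text{s.t.} \qquad
\theta^{*}(m) \in \arg\min_{\theta}\; \mathcal{L}(m \odot \theta),
$$

$$
\frac{d\mathcal{L}}{dm} \;=\; \nabla_{m}\mathcal{L} \;-\; \nabla^{2}_{m\theta}\mathcal{L}\,\big(\nabla^{2}_{\theta\theta}\mathcal{L}\big)^{-1}\nabla_{\theta}\mathcal{L}.
$$

The second line is the generic implicit-function-theorem gradient of the upper-level objective, which is where the second-order derivatives and the matrix inverse mentioned in the first review come from. The reviews state that under the Hessian-free assumption (treating the curvature of L with respect to theta as negligible, the very assumption the third reviewer questions) and the bilinear dependence of the masked network on m and theta, the surviving terms involve only first-order quantities, which is what makes the alternating lower-level SGD / upper-level SPGD loop roughly as cheap as one-shot pruning; the exact closed form is in the paper and is not reproduced here. This nesting is also the main conceptual difference from the L0-based pruning line raised by the second reviewer, where mask entries are modeled as random variables with learnable parameters and trained jointly with the weights in a single-level objective rather than as the lower/upper levels above.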
[ 673, 323, 2169, 819, 25004, 4142, 840, 476, 891, 9580, 2169, 7200, 407, 970, 436, 1332, 50276, 2520, 2929, 8219, 323, 271, 440, 34218, 819, 25004, 1332, 533, 627, 310, 247, 4619, 2523, 327, 253, 440, 34218, 819, 25004, 891, 1158, 436, 2929, 556, 295, 311, 405, 8180, 327, 326, 50276, 783, 5661, 11809, 403, 594, 11096, 352, 943, 320, 6508, 281, 643, 11132, 15302, 285, 35615, 50276, 7152, 33032, 8716, 323, 253, 9880, 36284, 275, 253, 36284, 13571, 9079, 298, 394, 310, 273, 1270, 1600, 275, 253, 5145, 4715, 13361, 3114, 285, 436, 2929, 13698, 281, 1089, 824, 9880, 14997, 275, 954, 2219, 436, 789, 17075, 684, 253, 1566, 819, 25004, 8558, 440, 34218, 533, 671, 8725, 281, 18872, 819, 25004, 347, 247, 26413, 652, 13757, 31767, 835, 253, 2406, 5251, 13757, 9010, 253, 1682, 1896, 873, 273, 13461, 1677, 253, 23507, 11454, 2990, 326, 310, 2801, 25965, 285, 253, 5170, 5251, 13757, 310, 39793, 323, 253, 12419, 8989, 2218, 970, 5415, 17040, 3560, 407, 7887, 3169, 46551, 4477, 806, 15313, 253, 2087, 2048, 323, 253, 11786, 342, 1675, 281, 8989, 970, 271, 15424, 11786, 534, 8687, 1273, 2621, 13335, 4315, 13737, 285, 295, 295, 12624, 835, 295, 275, 3602, 275, 253, 2990, 2299, 342, 253, 344, 859, 757, 1959, 9376, 285, 1677, 253, 3753, 273, 253, 1895, 10370, 48971, 275, 8989, 278, 285, 3602, 39116, 253, 2457, 2048, 273, 11786, 7819, 562, 281, 452, 760, 806, 2621, 13335, 1907, 2931, 253, 12091, 597, 4720, 4262, 281, 6266, 253, 5933, 534, 8687, 2406, 5251, 256, 35333, 285, 5170, 5251, 653, 35333, 2218, 1919, 14940, 534, 1057, 5108, 275, 3946, 840, 253, 2929, 16947, 281, 253, 4679, 7668, 2709, 35615, 2709, 15302, 285, 1027, 819, 25004, 11878, 3045, 17082, 6388, 2457, 7200, 3045, 4103, 281, 253, 14086, 3215, 11273, 1566, 285, 4583, 1408, 673, 2266, 4679, 921, 326, 15086, 33526, 8936, 3045, 685, 253, 3236, 14086, 1566, 26332, 9010, 9880, 36284, 285, 556, 247, 10870, 390, 8936, 3045, 281, 34560, 9777, 819, 25004, 1607, 1223, 1907, 1199, 16277, 20243, 2074, 281, 4394, 12022, 819, 25004, 3082, 50276, 296, 3755, 20556, 50276, 18, 253, 2929, 310, 44003, 285, 973, 3542, 50276, 19, 253, 3762, 310, 3477, 281, 956, 285, 253, 3368, 2593, 310, 2266, 7668, 490, 77, 569, 2439, 1027, 4373, 22041, 50276, 20, 19840, 403, 973, 10932, 50275, 1189, 455, 891, 11346, 4361, 436, 789, 209, 422, 690, 13991, 534, 8558, 403, 3081, 10414, 281, 320, 5469, 285, 690, 3081, 24426, 326, 812, 1361, 281, 1056, 436, 247, 2266, 19529, 534, 891, 6266, 275, 253, 1563, 14855, 2593, 50275, 20881, 1255, 265, 337, 1223, 253, 10659, 310, 5393, 275, 253, 30762, 476, 253, 4477, 2486, 271, 5933, 2972, 323, 247, 1805, 4685, 273, 253, 4583, 2685, 50276, 19, 849, 403, 253, 11117, 39657, 6777, 323, 253, 3733, 1057, 352, 6388, 690, 2238, 273, 749, 2307, 792, 13757, 281, 755, 253, 10130, 752, 6569, 604, 253, 14604, 323, 256, 35333, 310, 36143, 1027, 432, 253, 14604, 273, 653, 35333, 50276, 20, 253, 14940, 906, 2011, 275, 253, 30762, 8687, 2457, 7200, 1057, 436, 5108, 326, 253, 25965, 1335, 1978, 327, 6890, 326, 310, 4560, 253, 749, 18428, 533, 512, 273, 326, 749, 3024, 4896, 452, 2074, 3045, 672, 10166, 2684, 256, 35333, 50276, 21, 352, 369, 5393, 326, 256, 35333, 5018, 403, 4934, 4229, 281, 337, 2299, 387, 253, 1077, 806, 3213, 273, 819, 25004, 2139, 651, 1335, 1907, 816, 337, 256, 35333, 3213, 36433, 50276, 22, 2829, 337, 476, 4477, 823, 247, 9118, 35823, 1386, 323, 253, 15302, 50276, 38092, 10414, 326, 476, 320, 5469, 337, 2602, 7887, 2801, 294, 19484, 1320, 323, 3037, 494, 37139, 414, 17857, 1686, 938, 374, 3576, 3733, 273, 23507, 11454, 6928, 762, 4156, 
37139, 414, 7658, 30105, 1087, 1797, 495, 294, 37341, 26413, 652, 13757, 275, 11454, 10336, 3186, 247, 33342, 1768, 10491, 8668, 39951, 2284, 1797, 50276, 284, 5393, 275, 253, 2022, 2278, 5474, 33032, 2520, 2929, 3400, 247, 4460, 8460, 1427, 273, 1566, 819, 25004, 347, 247, 26413, 652, 13757, 31767, 1895, 275, 534, 253, 22199, 273, 819, 25004, 1221, 26208, 819, 25004, 476, 320, 11575, 347, 767, 13757, 2308, 337, 4560, 253, 819, 25004, 8989, 253, 5170, 5251, 285, 374, 34741, 1566, 851, 26208, 253, 2406, 5251, 50276, 783, 2929, 2007, 4081, 271, 5933, 26413, 652, 819, 25004, 15086, 281, 320, 247, 31767, 47037, 326, 4648, 760, 253, 806, 2621, 11786, 534, 2789, 352, 347, 5919, 347, 4394, 12022, 819, 25004, 50276, 783, 3368, 1543, 921, 326, 15086, 1298, 2824, 253, 1029, 6733, 273, 4394, 12022, 819, 25004, 285, 18922, 253, 1029, 7200, 273, 34560, 9777, 819, 25004, 1607, 275, 1097, 18872, 285, 440, 34218, 819, 25004, 15849, 20544, 50275, 783, 2934, 273, 253, 31767, 8460, 1427, 273, 253, 1566, 819, 25004, 1895, 310, 747, 534, 3400, 247, 10527, 3720, 323, 31767, 11333, 281, 320, 14859, 323, 1566, 819, 25004, 50275, 783, 4081, 15086, 819, 25004, 5933, 310, 3236, 285, 8542, 50275, 783, 2929, 310, 973, 10932, 285, 3477, 281, 956, 50276, 20881, 1255, 50275, 262, 651, 320, 1663, 9371, 281, 823, 690, 5955, 670, 253, 22620, 285, 3910, 875, 253, 31767, 285, 253, 298, 17, 3169, 819, 25004, 5987, 39962, 2061, 9275, 1166, 805, 13144, 805, 9275, 298, 17, 3169, 3082, 1859, 819, 25004, 25965, 347, 3632, 4903, 2931, 407, 2173, 3602, 3021, 2403, 253, 819, 25004, 25965, 812, 320, 6311, 2366, 342, 253, 3210, 436, 310, 247, 1027, 8668, 2429, 281, 31767, 285, 841, 767, 6849, 778, 320, 5678, 285, 27998, 534, 2789, 352, 4409, 1907, 247, 5955, 285, 5301, 253, 4477, 452, 9713, 253, 7364, 18212, 352, 651, 320, 1805, 281, 2118, 690, 28913, 2175, 751, 4677, 247, 21, 281, 253, 2022, 7714, 2167, 253, 7223, 778, 320, 3710, 5474, 339, 431, 248, 4477, 9113, 4271, 326, 34560, 9777, 819, 25004, 1607, 310, 31334, 281, 4908, 9880, 14997, 432, 11454, 6928, 281, 436, 990, 597, 7409, 247, 1332, 4981, 2182, 26413, 652, 5556, 5837, 31767, 281, 4908, 9880, 14997, 50276, 10387, 18872, 1543, 534, 476, 4354, 4917, 1524, 10186, 3885, 8777, 275, 253, 2929, 597, 36803, 819, 25004, 347, 247, 31767, 1895, 285, 9674, 7472, 2439, 260, 338, 274, 6903, 361, 10058, 303, 6533, 292, 285, 4440, 257, 292, 253, 1543, 403, 12724, 1805, 685, 1607, 285, 11786, 3169, 3779, 4364, 3082, 824, 347, 15909, 50274, 2887, 2708, 323, 2380, 281, 4477, 4868, 556, 644, 9300, 846, 30080, 22559, 891, 717, 20634, 273, 253, 9021, 273, 436, 2929, 1580, 253, 897, 273, 9880, 13571, 3133, 281, 320, 689, 19052, 275, 436, 789, 275, 253, 12002, 436, 1307, 310, 2931, 347, 26332, 819, 37437, 23507, 3210, 342, 1805, 26647, 685, 253, 3236, 14086, 3210, 533, 50276, 2520, 310, 417, 253, 7561, 5821, 2220, 5426, 3104, 4791, 281, 5345, 1918, 253, 625, 7561, 7607, 5426, 533, 1386, 28325, 4245, 247, 1027, 5426, 534, 891, 13414, 5194, 342, 352, 310, 9560, 326, 253, 789, 310, 1881, 32474, 285, 5185, 342, 643, 6239, 50276, 34814, 327, 432, 436, 8310, 352, 3133, 326, 253, 819, 25004, 310, 2218, 4768, 3733, 436, 2550, 12054, 320, 2429, 281, 1607, 390, 15909, 253, 819, 25004, 8989, 310, 873, 387, 19502, 470, 390, 2810, 281, 470, 285, 1057, 417, 1818, 4768, 3733, 352, 310, 671, 4409, 15806, 326, 15909, 310, 417, 256, 5503, 10542, 368, 943, 7277, 281, 354, 27375, 407, 355, 478, 796, 73, 1162, 355, 11106, 407, 634, 789, 2168, 253, 760, 3588, 5301, 891, 476, 923, 651, 320, 281, 253, 2393, 23182, 789, 407, 
368, 1162, 355, 50276, 2858, 368, 13414, 2085, 667, 275, 436, 3884, 275, 1635, 368, 452, 2530, 1643, 14023, 281, 3082, 534, 819, 2517, 846, 3733, 285, 4931, 513, 690, 1355, 2408, 273, 4030, 25184, 841, 403, 671, 5272, 21607, 281, 436, 789, 50276, 71, 3341, 368, 403, 417, 253, 806, 281, 513, 26413, 652, 5556, 5837, 323, 819, 25004, 46350, 2990, 819, 25004, 323, 2494, 35019, 398, 407, 10875, 261, 285, 18209, 651, 320, 271, 1650, 2720, 1445, 285, 516, 2119, 627, 403, 2571, 50276, 1568, 403, 690, 625, 5884, 7906, 50275, 3211, 310, 2908, 534, 310, 1077, 5322, 1580, 253, 1332, 310, 9648, 9542, 50276, 74, 1663, 49638, 436, 789, 323, 10054, 4116, 281, 18872, 819, 25004, 50276, 953, 247, 1663, 1892, 1895, 285, 2080, 625, 4623, 685, 954, 643, 819, 25004, 10746, 50276, 2858, 891, 717, 417, 2119, 368, 1894, 885, 253, 2905, 789, 9648, 2014, 5103, 18872, 819, 25004, 407, 3889, 717, 398, 4786, 430, 1162, 355, 285, 354, 27375, 407, 355, 478, 796, 73, 1162, 355, 1097, 7515, 436, 3884, 387, 3302, 5837, 50276, 251, 298, 17107, 368, 1333, 326, 359, 476, 5467, 295, 6348, 19, 298, 50276, 17, 891, 717, 417, 13762, 436, 310, 2686, 2032, 275, 3946, 846, 512, 436, 310, 849, 1142, 4321, 819, 25004, 3779, 4364, 3082, 4307, 2299, 891, 717, 417, 3981, 326, 436, 2789, 253, 789, 13583, 34754, 1684, 598, 11678, 275, 45439, 50276, 261, 627, 247, 1175, 1921, 281, 897, 3806, 1283, 347, 253, 22791, 2581, 685, 253, 22791, 2530, 407, 21332, 868, 5816, 253, 1616, 789, 352, 310, 1077, 9470, 285, 643, 2987, 452, 4270, 2220, 352, 275, 3332, 1107, 50276, 13206, 721, 66, 310, 417, 2217, 281, 15249, 253, 31640, 281, 294, 88, 3087, 326, 10880, 247, 9880, 13571, 368, 306, 2530, 1543, 327, 253, 8004, 1566, 342, 253, 24746, 10895, 625, 2834, 6667, 403, 3058, 50276, 783, 2457, 1386, 273, 253, 6452, 310, 247, 2372, 8664, 275, 3946, 18872, 819, 25004, 3082, 812, 819, 2517, 253, 1072, 1180, 273, 3602, 533, 4917, 9106, 1027, 3885, 8777, 352, 310, 271, 1774, 2181, 281, 2557, 891, 13414, 1158, 627, 310, 667, 6843, 5955, 273, 7364, 275, 253, 2022, 2929, 253, 44282, 2792, 441, 281, 30762, 260, 21, 533, 253, 373, 2717, 627, 5001, 7364, 285, 891, 452, 5469, 690, 1840, 2490, 187, 4118, 18435, 27, 783, 30628, 574, 3012, 11711, 3390, 11626, 327, 436, 7714, 253, 2022, 2523, 762, 5955, 369, 1880, 253, 39926, 273, 436, 2929, 347, 247, 36284, 13571, 789, 369, 3451, 1677, 326, 253, 2022, 27163, 897, 642, 294, 19078, 1320, 390, 294, 88, 3087, 327, 6654, 891, 1158, 326, 1223, 581, 37317, 369, 1077, 4016, 670, 253, 2929, 253, 30859, 369, 6571, 18376, 1975, 253, 6832, 4468, 310, 1880, 253, 7103, 5301, 10646, 31767, 342, 642, 294, 88, 3087, 310, 2429, 1411, 3082, 326, 897, 294, 88, 3087, 275, 4677, 495, 310, 4344, 253, 4477, 3794, 281, 436, 407, 5277, 14023, 275, 8442, 721, 66, 285, 247, 883, 326, 7472, 294, 88, 3087, 327, 690, 8892, 2299, 841, 8442, 1646, 281, 921, 326, 253, 7200, 273, 31767, 310, 4336, 39188, 281, 294, 88, 3087, 395, 1014, 281, 3426, 294, 19078, 1320, 275, 253, 3236, 2257, 350, 1767, 21315, 3282, 436, 16540, 253, 3626, 1953, 2139, 417, 816, 7472, 8558, 275, 253, 294, 19078, 1025, 1083, 835, 253, 373, 642, 878, 281, 294, 3182, 253, 1307, 9880, 13571, 326, 310, 253, 2644, 9759, 3133, 281, 320, 24291, 253, 1039, 352, 943, 320, 3559, 15419, 11634, 310, 50275, 7053, 359, 921, 326, 31767, 41731, 13015, 643, 3082, 275, 253, 10610, 36284, 13571, 9459, 835, 359, 14932, 512, 253, 13461, 281, 31850, 2233, 294, 16668, 50276, 2520, 651, 8171, 253, 1246, 4677, 495, 436, 651, 320, 247, 4344, 5301, 10941, 10610, 455, 302, 350, 1767, 21315, 28617, 819, 25004, 
3082, 281, 1016, 643, 50276, 8384, 359, 921, 326, 581, 5750, 273, 31767, 310, 326, 12401, 643, 9880, 42186, 28983, 3082, 697, 3045, 310, 13727, 281, 294, 88, 3087, 326, 310, 604, 752, 359, 971, 310, 816, 7200, 273, 253, 819, 37437, 1566, 285, 417, 281, 513, 690, 3686, 273, 8249, 5839, 273, 253, 36284, 13571, 9079, 840, 31767, 41731, 13015, 643, 3082, 672, 359, 13414, 513, 667, 294, 88, 3087, 387, 512, 50276, 3113, 436, 3686, 273, 9759, 891, 1158, 253, 4477, 812, 452, 16371, 253, 4016, 30628, 21915, 5747, 841, 1246, 1050, 20792, 1975, 3374, 891, 1158, 326, 253, 373, 2217, 7681, 7680, 1060, 342, 8442, 721, 66, 285, 247, 883, 281, 2118, 3579, 342, 14924, 3340, 7296, 253, 23027, 273, 253, 643, 30628, 285, 253, 7681, 38135, 273, 253, 26413, 652, 2746, 253, 16774, 1543, 403, 627, 285, 8442, 721, 66, 285, 247, 883, 921, 247, 2590, 4602, 342, 253, 36284, 13571, 789, 597, 250, 816, 3559, 38612, 285, 891, 1158, 627, 403, 417, 667, 7936, 7681, 3374, 1060, 326, 45035, 14924 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 673, 323, 2169, 819, 25004, 4142, 840, 476, 891, 9580, 2169, 7200, 407, 970, 436, 1332, 50276, 2520, 2929, 8219, 323, 271, 440, 34218, 819, 25004, 1332, 533, 627, 310, 247, 4619, 2523, 327, 253, 440, 34218, 819, 25004, 891, 1158, 436, 2929, 556, 295, 311, 405, 8180, 327, 326, 50276, 783, 5661, 11809, 403, 594, 11096, 352, 943, 320, 6508, 281, 643, 11132, 15302, 285, 35615, 50276, 7152, 33032, 8716, 323, 253, 9880, 36284, 275, 253, 36284, 13571, 9079, 298, 394, 310, 273, 1270, 1600, 275, 253, 5145, 4715, 13361, 3114, 285, 436, 2929, 13698, 281, 1089, 824, 9880, 14997, 275, 954, 2219, 436, 789, 17075, 684, 253, 1566, 819, 25004, 8558, 440, 34218, 533, 671, 8725, 281, 18872, 819, 25004, 347, 247, 26413, 652, 13757, 31767, 835, 253, 2406, 5251, 13757, 9010, 253, 1682, 1896, 873, 273, 13461, 1677, 253, 23507, 11454, 2990, 326, 310, 2801, 25965, 285, 253, 5170, 5251, 13757, 310, 39793, 323, 253, 12419, 8989, 2218, 970, 5415, 17040, 3560, 407, 7887, 3169, 46551, 4477, 806, 15313, 253, 2087, 2048, 323, 253, 11786, 342, 1675, 281, 8989, 970, 271, 15424, 11786, 534, 8687, 1273, 2621, 13335, 4315, 13737, 285, 295, 295, 12624, 835, 295, 275, 3602, 275, 253, 2990, 2299, 342, 253, 344, 859, 757, 1959, 9376, 285, 1677, 253, 3753, 273, 253, 1895, 10370, 48971, 275, 8989, 278, 285, 3602, 39116, 253, 2457, 2048, 273, 11786, 7819, 562, 281, 452, 760, 806, 2621, 13335, 1907, 2931, 253, 12091, 597, 4720, 4262, 281, 6266, 253, 5933, 534, 8687, 2406, 5251, 256, 35333, 285, 5170, 5251, 653, 35333, 2218, 1919, 14940, 534, 1057, 5108, 275, 3946, 840, 253, 2929, 16947, 281, 253, 4679, 7668, 2709, 35615, 2709, 15302, 285, 1027, 819, 25004, 11878, 3045, 17082, 6388, 2457, 7200, 3045, 4103, 281, 253, 14086, 3215, 11273, 1566, 285, 4583, 1408, 673, 2266, 4679, 921, 326, 15086, 33526, 8936, 3045, 685, 253, 3236, 14086, 1566, 26332, 9010, 9880, 36284, 285, 556, 247, 10870, 390, 8936, 3045, 281, 34560, 9777, 819, 25004, 1607, 1223, 1907, 1199, 16277, 20243, 2074, 281, 4394, 12022, 819, 25004, 3082, 50276, 296, 3755, 20556, 50276, 18, 253, 2929, 310, 44003, 285, 973, 3542, 50276, 19, 253, 3762, 310, 3477, 281, 956, 285, 253, 3368, 2593, 310, 2266, 7668, 490, 77, 569, 2439, 1027, 4373, 22041, 50276, 20, 19840, 403, 973, 10932, 50275, 1189, 455, 891, 11346, 4361, 436, 789, 209, 422, 690, 13991, 534, 8558, 403, 3081, 10414, 281, 320, 5469, 285, 690, 3081, 24426, 326, 812, 1361, 281, 1056, 436, 247, 2266, 19529, 534, 891, 6266, 275, 253, 1563, 14855, 2593, 50275, 20881, 1255, 265, 337, 1223, 253, 10659, 310, 5393, 275, 253, 30762, 476, 253, 4477, 2486, 271, 5933, 2972, 323, 247, 1805, 4685, 273, 253, 4583, 2685, 50276, 19, 849, 403, 253, 11117, 39657, 6777, 323, 253, 3733, 1057, 352, 6388, 690, 2238, 273, 749, 2307, 792, 13757, 281, 755, 253, 10130, 752, 6569, 604, 253, 14604, 323, 256, 35333, 310, 36143, 1027, 432, 253, 14604, 273, 653, 35333, 50276, 20, 253, 14940, 906, 2011, 275, 253, 30762, 8687, 2457, 7200, 1057, 436, 5108, 326, 253, 25965, 1335, 1978, 327, 6890, 326, 310, 4560, 253, 749, 18428, 533, 512, 273, 326, 749, 3024, 4896, 452, 2074, 3045, 672, 10166, 2684, 256, 35333, 50276, 21, 352, 369, 5393, 326, 256, 35333, 5018, 403, 4934, 4229, 281, 337, 2299, 387, 253, 1077, 806, 3213, 273, 819, 25004, 2139, 651, 1335, 1907, 816, 337, 256, 35333, 3213, 36433, 50276, 22, 2829, 337, 476, 4477, 823, 247, 9118, 35823, 1386, 323, 253, 15302, 50276, 38092, 10414, 326, 476, 320, 5469, 337, 2602, 7887, 2801, 294, 19484, 1320, 323, 3037, 494, 37139, 414, 17857, 1686, 938, 374, 3576, 3733, 273, 23507, 11454, 6928, 762, 4156, 
37139, 414, 7658, 30105, 1087, 1797, 495, 294, 37341, 26413, 652, 13757, 275, 11454, 10336, 3186, 247, 33342, 1768, 10491, 8668, 39951, 2284, 1797, 50276, 284, 5393, 275, 253, 2022, 2278, 5474, 33032, 2520, 2929, 3400, 247, 4460, 8460, 1427, 273, 1566, 819, 25004, 347, 247, 26413, 652, 13757, 31767, 1895, 275, 534, 253, 22199, 273, 819, 25004, 1221, 26208, 819, 25004, 476, 320, 11575, 347, 767, 13757, 2308, 337, 4560, 253, 819, 25004, 8989, 253, 5170, 5251, 285, 374, 34741, 1566, 851, 26208, 253, 2406, 5251, 50276, 783, 2929, 2007, 4081, 271, 5933, 26413, 652, 819, 25004, 15086, 281, 320, 247, 31767, 47037, 326, 4648, 760, 253, 806, 2621, 11786, 534, 2789, 352, 347, 5919, 347, 4394, 12022, 819, 25004, 50276, 783, 3368, 1543, 921, 326, 15086, 1298, 2824, 253, 1029, 6733, 273, 4394, 12022, 819, 25004, 285, 18922, 253, 1029, 7200, 273, 34560, 9777, 819, 25004, 1607, 275, 1097, 18872, 285, 440, 34218, 819, 25004, 15849, 20544, 50275, 783, 2934, 273, 253, 31767, 8460, 1427, 273, 253, 1566, 819, 25004, 1895, 310, 747, 534, 3400, 247, 10527, 3720, 323, 31767, 11333, 281, 320, 14859, 323, 1566, 819, 25004, 50275, 783, 4081, 15086, 819, 25004, 5933, 310, 3236, 285, 8542, 50275, 783, 2929, 310, 973, 10932, 285, 3477, 281, 956, 50276, 20881, 1255, 50275, 262, 651, 320, 1663, 9371, 281, 823, 690, 5955, 670, 253, 22620, 285, 3910, 875, 253, 31767, 285, 253, 298, 17, 3169, 819, 25004, 5987, 39962, 2061, 9275, 1166, 805, 13144, 805, 9275, 298, 17, 3169, 3082, 1859, 819, 25004, 25965, 347, 3632, 4903, 2931, 407, 2173, 3602, 3021, 2403, 253, 819, 25004, 25965, 812, 320, 6311, 2366, 342, 253, 3210, 436, 310, 247, 1027, 8668, 2429, 281, 31767, 285, 841, 767, 6849, 778, 320, 5678, 285, 27998, 534, 2789, 352, 4409, 1907, 247, 5955, 285, 5301, 253, 4477, 452, 9713, 253, 7364, 18212, 352, 651, 320, 1805, 281, 2118, 690, 28913, 2175, 751, 4677, 247, 21, 281, 253, 2022, 7714, 2167, 253, 7223, 778, 320, 3710, 5474, 339, 431, 248, 4477, 9113, 4271, 326, 34560, 9777, 819, 25004, 1607, 310, 31334, 281, 4908, 9880, 14997, 432, 11454, 6928, 281, 436, 990, 597, 7409, 247, 1332, 4981, 2182, 26413, 652, 5556, 5837, 31767, 281, 4908, 9880, 14997, 50276, 10387, 18872, 1543, 534, 476, 4354, 4917, 1524, 10186, 3885, 8777, 275, 253, 2929, 597, 36803, 819, 25004, 347, 247, 31767, 1895, 285, 9674, 7472, 2439, 260, 338, 274, 6903, 361, 10058, 303, 6533, 292, 285, 4440, 257, 292, 253, 1543, 403, 12724, 1805, 685, 1607, 285, 11786, 3169, 3779, 4364, 3082, 824, 347, 15909, 50274, 2887, 2708, 323, 2380, 281, 4477, 4868, 556, 644, 9300, 846, 30080, 22559, 891, 717, 20634, 273, 253, 9021, 273, 436, 2929, 1580, 253, 897, 273, 9880, 13571, 3133, 281, 320, 689, 19052, 275, 436, 789, 275, 253, 12002, 436, 1307, 310, 2931, 347, 26332, 819, 37437, 23507, 3210, 342, 1805, 26647, 685, 253, 3236, 14086, 3210, 533, 50276, 2520, 310, 417, 253, 7561, 5821, 2220, 5426, 3104, 4791, 281, 5345, 1918, 253, 625, 7561, 7607, 5426, 533, 1386, 28325, 4245, 247, 1027, 5426, 534, 891, 13414, 5194, 342, 352, 310, 9560, 326, 253, 789, 310, 1881, 32474, 285, 5185, 342, 643, 6239, 50276, 34814, 327, 432, 436, 8310, 352, 3133, 326, 253, 819, 25004, 310, 2218, 4768, 3733, 436, 2550, 12054, 320, 2429, 281, 1607, 390, 15909, 253, 819, 25004, 8989, 310, 873, 387, 19502, 470, 390, 2810, 281, 470, 285, 1057, 417, 1818, 4768, 3733, 352, 310, 671, 4409, 15806, 326, 15909, 310, 417, 256, 5503, 10542, 368, 943, 7277, 281, 354, 27375, 407, 355, 478, 796, 73, 1162, 355, 11106, 407, 634, 789, 2168, 253, 760, 3588, 5301, 891, 476, 923, 651, 320, 281, 253, 2393, 23182, 789, 407, 
368, 1162, 355, 50276, 2858, 368, 13414, 2085, 667, 275, 436, 3884, 275, 1635, 368, 452, 2530, 1643, 14023, 281, 3082, 534, 819, 2517, 846, 3733, 285, 4931, 513, 690, 1355, 2408, 273, 4030, 25184, 841, 403, 671, 5272, 21607, 281, 436, 789, 50276, 71, 3341, 368, 403, 417, 253, 806, 281, 513, 26413, 652, 5556, 5837, 323, 819, 25004, 46350, 2990, 819, 25004, 323, 2494, 35019, 398, 407, 10875, 261, 285, 18209, 651, 320, 271, 1650, 2720, 1445, 285, 516, 2119, 627, 403, 2571, 50276, 1568, 403, 690, 625, 5884, 7906, 50275, 3211, 310, 2908, 534, 310, 1077, 5322, 1580, 253, 1332, 310, 9648, 9542, 50276, 74, 1663, 49638, 436, 789, 323, 10054, 4116, 281, 18872, 819, 25004, 50276, 953, 247, 1663, 1892, 1895, 285, 2080, 625, 4623, 685, 954, 643, 819, 25004, 10746, 50276, 2858, 891, 717, 417, 2119, 368, 1894, 885, 253, 2905, 789, 9648, 2014, 5103, 18872, 819, 25004, 407, 3889, 717, 398, 4786, 430, 1162, 355, 285, 354, 27375, 407, 355, 478, 796, 73, 1162, 355, 1097, 7515, 436, 3884, 387, 3302, 5837, 50276, 251, 298, 17107, 368, 1333, 326, 359, 476, 5467, 295, 6348, 19, 298, 50276, 17, 891, 717, 417, 13762, 436, 310, 2686, 2032, 275, 3946, 846, 512, 436, 310, 849, 1142, 4321, 819, 25004, 3779, 4364, 3082, 4307, 2299, 891, 717, 417, 3981, 326, 436, 2789, 253, 789, 13583, 34754, 1684, 598, 11678, 275, 45439, 50276, 261, 627, 247, 1175, 1921, 281, 897, 3806, 1283, 347, 253, 22791, 2581, 685, 253, 22791, 2530, 407, 21332, 868, 5816, 253, 1616, 789, 352, 310, 1077, 9470, 285, 643, 2987, 452, 4270, 2220, 352, 275, 3332, 1107, 50276, 13206, 721, 66, 310, 417, 2217, 281, 15249, 253, 31640, 281, 294, 88, 3087, 326, 10880, 247, 9880, 13571, 368, 306, 2530, 1543, 327, 253, 8004, 1566, 342, 253, 24746, 10895, 625, 2834, 6667, 403, 3058, 50276, 783, 2457, 1386, 273, 253, 6452, 310, 247, 2372, 8664, 275, 3946, 18872, 819, 25004, 3082, 812, 819, 2517, 253, 1072, 1180, 273, 3602, 533, 4917, 9106, 1027, 3885, 8777, 352, 310, 271, 1774, 2181, 281, 2557, 891, 13414, 1158, 627, 310, 667, 6843, 5955, 273, 7364, 275, 253, 2022, 2929, 253, 44282, 2792, 441, 281, 30762, 260, 21, 533, 253, 373, 2717, 627, 5001, 7364, 285, 891, 452, 5469, 690, 1840, 2490, 187, 4118, 18435, 27, 783, 30628, 574, 3012, 11711, 3390, 11626, 327, 436, 7714, 253, 2022, 2523, 762, 5955, 369, 1880, 253, 39926, 273, 436, 2929, 347, 247, 36284, 13571, 789, 369, 3451, 1677, 326, 253, 2022, 27163, 897, 642, 294, 19078, 1320, 390, 294, 88, 3087, 327, 6654, 891, 1158, 326, 1223, 581, 37317, 369, 1077, 4016, 670, 253, 2929, 253, 30859, 369, 6571, 18376, 1975, 253, 6832, 4468, 310, 1880, 253, 7103, 5301, 10646, 31767, 342, 642, 294, 88, 3087, 310, 2429, 1411, 3082, 326, 897, 294, 88, 3087, 275, 4677, 495, 310, 4344, 253, 4477, 3794, 281, 436, 407, 5277, 14023, 275, 8442, 721, 66, 285, 247, 883, 326, 7472, 294, 88, 3087, 327, 690, 8892, 2299, 841, 8442, 1646, 281, 921, 326, 253, 7200, 273, 31767, 310, 4336, 39188, 281, 294, 88, 3087, 395, 1014, 281, 3426, 294, 19078, 1320, 275, 253, 3236, 2257, 350, 1767, 21315, 3282, 436, 16540, 253, 3626, 1953, 2139, 417, 816, 7472, 8558, 275, 253, 294, 19078, 1025, 1083, 835, 253, 373, 642, 878, 281, 294, 3182, 253, 1307, 9880, 13571, 326, 310, 253, 2644, 9759, 3133, 281, 320, 24291, 253, 1039, 352, 943, 320, 3559, 15419, 11634, 310, 50275, 7053, 359, 921, 326, 31767, 41731, 13015, 643, 3082, 275, 253, 10610, 36284, 13571, 9459, 835, 359, 14932, 512, 253, 13461, 281, 31850, 2233, 294, 16668, 50276, 2520, 651, 8171, 253, 1246, 4677, 495, 436, 651, 320, 247, 4344, 5301, 10941, 10610, 455, 302, 350, 1767, 21315, 28617, 819, 25004, 
3082, 281, 1016, 643, 50276, 8384, 359, 921, 326, 581, 5750, 273, 31767, 310, 326, 12401, 643, 9880, 42186, 28983, 3082, 697, 3045, 310, 13727, 281, 294, 88, 3087, 326, 310, 604, 752, 359, 971, 310, 816, 7200, 273, 253, 819, 37437, 1566, 285, 417, 281, 513, 690, 3686, 273, 8249, 5839, 273, 253, 36284, 13571, 9079, 840, 31767, 41731, 13015, 643, 3082, 672, 359, 13414, 513, 667, 294, 88, 3087, 387, 512, 50276, 3113, 436, 3686, 273, 9759, 891, 1158, 253, 4477, 812, 452, 16371, 253, 4016, 30628, 21915, 5747, 841, 1246, 1050, 20792, 1975, 3374, 891, 1158, 326, 253, 373, 2217, 7681, 7680, 1060, 342, 8442, 721, 66, 285, 247, 883, 281, 2118, 3579, 342, 14924, 3340, 7296, 253, 23027, 273, 253, 643, 30628, 285, 253, 7681, 38135, 273, 253, 26413, 652, 2746, 253, 16774, 1543, 403, 627, 285, 8442, 721, 66, 285, 247, 883, 921, 247, 2590, 4602, 342, 253, 36284, 13571, 789, 597, 250, 816, 3559, 38612, 285, 891, 1158, 627, 403, 417, 667, 7936, 7681, 3374, 1060, 326, 45035, 14924 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary the paper studies the relationship between the intrinsic dimension of images and sample complexity and generalization the authors suggest to use a variant of the mle method of levina bickel 2004 which is based on computing the distances to nearest neighbors in pixel space which is fairly easy to implement and pros the authors aim to investigate two relevant hypotheses for the field of representation learning 1 intrinsic dimension of images is much lower than extrinsic dimension and 2 extrinsic dimension has little effect on sample complexity to test hypothesis 1 they use an estimator of the intrinsic dimension and measure its fitness in a controlled setting for which the know the real intrinsic dimension images generated by an stateoftheart gan hypothesis 1 is confirmed under this controlled setting and under a real scenario section 51 shows the inverse correlation between intrinsic dimension and sample complexity and shows that extrinsic dimension ie number of pixels has a much weaker correlation this section aims to confirm hypothesis number 2 i also find interesting the experiments in section 52 which studies the inverse correlation between intrinsic dimension and generalization ie test accuracy cons caption in figure 3 states that the authors observe the estimates to converge around the expected dimensionality of 10 however the dimensionality estimate greatly depends on k the number of neighbours used for each image no varianceconfidence interval methods are reported in this figure so its unclear whether the differences between 12 and 10 are large or not although they seem small if one compares against the extrinsic dimension of the images 128x128x3 this paper uses yet another intrinsic dimension estimator different from gong et al 2019 and ansuini et al 2019 its unclear whats the impact of the estimator in the predicted value of the intrinsic dimension one of the emphasized contributions of the paper is that its the first to show that intrinsic but not extrinsic dimensionality matters for the generalization of deep networks page 6 as far as this reviewer is aware indeed this paper is the first to measure intrinsic dimensionality of images and its relationship with generalization but there are others that compare the intrinsic dimensionality of the final embedding with accuracy showing the same conclusion eg gong et al 2019 ansuini et al 2019 thus the authors should be more specific when talking about intrinsicextrinsic dimensionality it refers to the image not the embedding representation of a given deep neural network classifier both the intrinsic dimensionality of the images and the classifier will impact the accuracy this paper only focuses on the former while other papers focus on the latter since this paper is posterior to the aforementioned papers it would be appreciated if the authors could comment on which intrinsic dimensionality shows larger correlation with generalization and draw some relationship among them some figures can be hardly read if printed in grayscale figures 3 6 7 8 i would suggest to use different line styles to better discern among curves in the plots and using hatches in the histograms figure 1 in the introduction it seems that the authors missed important seminal works on autoencoders eg reducing the dimensionality of data with neural networks by hinton and salakhutdinov 2016 since their references for autoencoders and regularization 
methods date back only to 2018 i have some minor concerns regarding computational cost the authors use a fraction of images as anchors and compute the nearest neighbour against the rest of the images in the dataset this still leads to a quadratic cost in the number of images in the dataset which may become problematic with modern datasets imagenet or even bigger ones given that the dimensionality estimates dont change much eg figure 3 why not fixing the number of samples to a constant number eg 1000 rationale for the score although i raised many points in my cons section many of these are more questions rather than specific issues that i have with the presented paper the paper tackles an important question of interest for the iclr community how to estimate the intrinsic dimensionality of our datsets and which impact does it have on generalization and sample complexity and it does so with a quite convincing experimental setup the method proposed by the authors could have important applications such as estimating the number of required training samples for reaching a target accuracy i hope that the authors can address my questionsconcerns during the rebuttal to increase my score update after discussion the authors have addressed all the points that i raised during the discussion i appreciate the effort and im increasing my score accordinglydocsepthis paper studies the intrinsic dimension of image datasets and connects it to the generalization ability of deep neural networks the three contributions are 1 measuring intrinsic dimension of common image datasets eg mnist cifar imagenet coco celeba 2 demonstrating using gans for which one has control of the intrinsic dimension the effectiveness of their dimension estimator for images and 3 tying intrinsic dimensionality to generalization performance strength i am actually a bit surprised that no existing work to my knowledge has carefully measured the intrinsic dimension of modern image datasets a work of such provides important justifications for numerous work on understanding and designing cnns based on lowdimensional assumptions therefore i appreciate the novelty and the significance of the work very much weaknesses my main concern is that the work appears underdeveloped in its current form and does not convincingly justify its conclusion in particular estimation of intrinsic dimension of dataset is the foundation for the development of this work but the discussion for it is very shallow 1 it is not clear why 1 provides an estimate of intrinsic dimensions under what assumptions is it derived for example is it assuming that the manifold is mostly flat or does it also work when manifold is curved 2 there is no discussion on the effect of k and how it should be selected what could go wrong if i pick k to be too large or low small will it result in overestimation or underestimation of the dimension 3 how is the dimension of the subspacemanifold affects the choice of k this question is relevant to table 1 where the comparison of dimension for different datasets are based on the same k 5 10 or 20 but likely the best choice of k is different for different datasets id also suggest an experiment in the gan setting where such effect is demonstrated through experiments perhaps with a figure similar to fig 3 but with a fixed number of samples and varying n in the xaxis 4 estimation of intrinsic dimension from data has been extensively studied see eg a b c and many recent works have used such notions to develop robust deep learning methods d e there should 
be a explanation on why 1 is picked over the other choices eg are they better choices than those used in d e a maximum likelihood estimation of intrinsic dimension 2004 b estimating the intrinsic dimension of datasets by a minimal neighborhood information 2017 c local intrinsic dimensionality i an extremevaluetheoretic foundation for similarity applications 2017 d dimensionalitydriven learning with noisy labels 2018 e characterizing adversarial subspaces using local intrinsic dimensionality 2018 overall i am not fully convinced as to whether the estimation of the intrinsic dimension from this paper is a faithful and good enough characterization of the true intrinsic dimension my suggestion is that the paper provides a review of existing dimension estimation methods clearly points out the pros and cons of each conceptually and perhaps also by experiments explains the reason for the specific choice in the paper and provides some discussion on the properties of the specific dimension estimation method minor comment a key challenge in estimating intrinsic dimension of images in eg imagenet is that they lie in relatively highdimensional subspaces and suffer more from curse of dimensionality therefore it may be beneficial to use data augmentation for generating more sample points which may help to improve the precision of estimation fig 3 is not color blind friendly maybe consider using different line types sec 5 first line establish established update after rebuttal i would like to thank the authors for the additional details provided in the rebuttal and revised version all my concerns have been adequately addressed given the importance of the topic and the development is reasonably solid i would like to recommend for its acceptance docsep review summary the paper proposes an empirical analysis of the dimension of natural images of multiple datasets the contributions are 1 a validation for nat im of previously proposed dimension estimation methods using gan to control the intrinsic dimension of the generated im 2 a confirmation that intrinsic dimension of nat im is lower than the dimension of their pixel space 3 that the lower the intrinsic dimension the easier the learning for neural net strengths the paper is wellwritten and easy to follow the data analysis pipeline is convincing up to my knowledge there is no such a clear statement about the dimension of natural image weaknesses the results are not sufficiently discussed i think the idea that nat im are lowd is more controversial than it is presented it is sometime proposed that image patches which are more likely to be textures are lowd eg brendel bethge iclr 2019 in relation to neural net learning these are known to be biased toward textures geirhos et al iclr 2019 so among your contribution 3 can be true while 2 is not but then what would explain your finding i mean that in fact 3 is more due two the lowd of textures than the lowd of nat im which would still be too highcomplementary to this it is suggested that natural images can be viewed as mixture of textures which belong to different lowd manifold vacher coencagli arxiv 190510629 vacher et al neurips 2020 minor comments none docsep the authors report a novel application of gans to validate the maximum likelihood estimator mle of the intrinsic dimension id of image data sets then they use the mle id estimator to characterize the intrinsic dimension of several commonly used computer vision data sets and link the data set id to the generalizability of trained classifiers they provide 
additional experiments that support the notion that it is intrinsic dimension and not extrinsic dimension ie of pixels that governs the performance of a binary classifier on these data sets also they verify that dimension plays a large role in learning on natural data i found the paper to be clearly written with only a few minor typographical errors in the writing and the subject to be of practical usefulness to the deep learning community however i do feel that the authors should perform a few additional experiments i think they are reasonably simple to improve the understanding of the results i think this might be a topic of great interest to the computer vision community since this paper describes a novel application of gans to study the sample complexity of convolutional neural networks review pros include innovative use of gans to generate synthetic data of bounded intrinsic dimension well written and easy to read coherent story cons limited analysis and discussion of the role of the image class particularly in imagenet on the mle estimate of intrinsic dimension some details not listed in the paper including the metric for the distance between images used to compute the mle id estimate nonsequitur in the analysis of the role of image extrinsic dimension in classifier generalization see comments below in section 51 the authors use the method of image reshaping using nearestneighbour interpolation to increase extrinsic dimensionality of images a comparison of the generalization performance of classifier models on data sets with different extrinsic dimensionality is thus made the authors make the claim that generalization performance depends on intrinsic but not extrinsic the dimensionality of images however the method of image reshaping using interpolation seems to be reversible ie lossless and thus their generated images should have identical information content independent of extrinsic dimension for example images collected using a highresolution camera can not be faithfully reproduced by interpolation of images taken by a lowresolution camera it would be more interesting to consider the realistic scenario where images of lower dimension are generated by lossy downsampling of higher resolution images and to then characterize whether either generalization performance or intrinsic dimension were related to extrinsic dimension i would guess that the intrinsic dimension of natural images is actually higher when they have a higher extrinsic dimension since there might be more fine details captured within images of high resolution conceptual example i might argue that the number and location of wrinkles on peoples faces is a variable that increases the intrinsic dimension of a facial photo data set but only if the images are sufficiently high resolution to see the wrinkles in section 52 the authors downsample images of 5 randomly chosen class pairs from each realworld data set and then 1 compute mle estimates of id for each data set and 2 compare the sample complexity of a binary classification problem on each it is interesting that while imagenet mle id k5 is 38 and cifar10 mle id k5 is 21 the mle id estimates which are made using k3 instead of k5 or another previously used value a choice not explained by the authors for the lowresolution 5randomclasspair samples of each data set are 154 and 114 respectively it seems that the id of the two datasets are much more similar after the resolutionreduction and classsampling suggesting that intrinsic dimension of image data sets is strongly 
determined by either extrinsic dimension or the number and kind of classes present the authors should comment on this fact furthermore given these observations the authors should make some effort to determine which process ie the lossy image reshaping or the random sampling of 5 class pairs contributes the most to the apparently much greater reduction in mle id that imagenet suffers compared to cifar10 in this experiment and whether there are specific classes in imagenet that contribute more to the intrinsic dimension of the data set than others perhaps the authors could simply compute the mle id estimate for the different classes of imagenet with and without image reshaping to 32x32x3 and then state which 5 class pairs were randomly chosen for the experiment in section 52 these additional computations might explain the unexpected crossover points mentioned in the paper in sections 53 and 54 the authors perform additional experiments showing that the intrinsic dimension in this case modulated by either adding fixeddimensionality noise or by applying image augmentation techniques is associated with generalization performance of deep learning models in the discussion the authors write while there may be many factors such as class separation and the number of classes which determine generalization we build the case that intrinsic dimension is one of these important factors i think it might be true the intrinsic dimension is actually determined by both extrinsic dimensions and by interclass and intraclass image diversity ie number of classes and varying hardness of each class during classification as i mentioned above the authors could perform several simple experiments using their existing experimental framework to test this hypothesis and i think doing so would significantly improve the quality of the paper despite this point i find the paper to be of acceptable quality ### Summary:
there was a consensus among reviewers that this paper should be accepted as the authors addressed reviewers concerns in the discussion phase this paper is wellwritten and easy to read it provides a coherent story and investigation on two important hypotheses that natural images have a lower intrinsic dimension than the extrinsic dimension eg the number of pixels and that a lower intrinsic dimension lowers the sample complexity of learning these results appear to be novel and significant for the iclr community as it provides justifications for numerous work on understanding and designing convolutional neural networks based on lowdimensional assumptions
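The reviews above repeatedly refer to the paper's equation (1), to the choice of the neighbourhood size k, and to why nearest-neighbour distances in pixel space yield a dimension estimate at all, but the estimator itself never appears in the text. For reference, the Levina-Bickel maximum-likelihood estimator they are discussing is usually written as below; this is the textbook form, and the exact variant used in the submission (for example how per-point estimates are aggregated, or the k=3 versus k=5 choice the last reviewer flags) may differ slightly.

$$
\hat{m}_{k}(x) \;=\; \Bigg[\frac{1}{k-1}\sum_{j=1}^{k-1}\log\frac{T_{k}(x)}{T_{j}(x)}\Bigg]^{-1},
\qquad
\hat{m}_{k} \;=\; \Bigg[\frac{1}{N}\sum_{i=1}^{N}\hat{m}_{k}(x_{i})^{-1}\Bigg]^{-1},
$$

where T_j(x) is the distance from an image x to its j-th nearest neighbour in pixel space and N is the number of anchor images. Written this way, two recurring review questions become concrete: k sets how far the local neighbourhood extends (a very small k makes the log-ratios noisy, a large k lets curvature of the manifold bias the estimate), and the nearest-neighbour search over anchors is the quadratic cost the first reviewer worries about.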
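Since the reviews also lean on the GAN sanity check (generate images from a generator with a d-dimensional latent space and verify the estimator recovers a value near d), here is a short self-contained sketch of both pieces. It is my own illustration, not the authors' released code; the scikit-learn neighbour search and the placeholder generator call are assumptions.

```python
# Minimal sketch of the Levina-Bickel MLE intrinsic-dimension estimate discussed above.
# Not the authors' code; function names and the use of numpy / scikit-learn are assumptions.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def mle_intrinsic_dimension(x, k=5):
    """x: (N, D) array of flattened images; returns the dataset-level MLE ID estimate."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(x)
    dist, _ = nn.kneighbors(x)   # (N, k+1); column 0 is each point's zero distance to itself
    dist = dist[:, 1:]           # (N, k) distances T_1 <= ... <= T_k
    # local inverse estimate: 1 / m_k(x_i) = (1/(k-1)) * sum_{j<k} log(T_k / T_j)
    inv_local = np.log(dist[:, -1:] / dist[:, :-1]).mean(axis=1)
    # aggregate by averaging the inverses, then inverting (MacKay-Ghahramani style correction)
    return 1.0 / inv_local.mean()

# GAN-based sanity check in the spirit of the reviews: `generator` is hypothetical and stands
# for any pretrained GAN whose latent space has a known dimension d.
# z = np.random.randn(num_samples, d)
# images = generator(z).reshape(num_samples, -1)
# print(mle_intrinsic_dimension(images, k=5))   # should land near d if the estimator is faithful
```

In practice one would subsample anchor points, as the paper apparently does, to keep the neighbour search tractable, and sweep k to see how stable the estimate is, which is exactly the sensitivity the reviewers ask about.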
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this manuscript introduces a new method for the reconstruction of continuous variability in cryogenic electronmicroscopy cryoem based on amortized inference and neural network autoencoders while amortized inference and autoencoders have previously been used for different aspects of this problem the proposed method provides a more comprehensive approach specifically previous methods have relied on a consensus model for pose estimation or performed expensive pose estimation as part of an alternating refinement process where the main focus has been on estimating the latent states of the molecule the proposed method performs the pose estimation at the same time as the latent state estimation achieving significant speedups with respect to the state of the art the method is validated on numerous synthetic datasets and one experimental dataset the main components of the proposed method have been proposed earlier amortized inference for pose estimation has been introduced by miolane et al 15 and for latent state estimation by zhong et al 33 joint estimation of both pose and latent state has been performed by rosenbaum et al 24 but only for synthetic data these previous results are accurately described in the manuscript as such the original work in this manuscript consists of combining these existing components into a functioning whole which is no easy feat it also does this while achieving significantly reduced running times compared to the state of the art albeit with a slight loss of accuracy the method is promising and the authors provide numerical results to support their claims that being said these numerical results do not completely validate the method as they currently stand first the synthetic data generated has a very high signaltonoise ratio snr by cryoem standards this can be seen most clearly by comparing the sample projections in figure s3 where two synthetic projection images are compared to an experimental projection image the noise level of the former is much lower compared to the latter in the text the authors simply mention that the variance of the noise applied is one without providing a measure of the variance of the clean images judging by eye i would guess that the snr for the synthetic images is close to one while realistic cryoem data typically has snrs ranging from one hundredth to one tenth it is important to note that despite these high snrs the proposed method performs worse than the stateoftheart with rotation errors on the order of 2 and translation errors typically on the order of 10 the results on the experimental data are also concerning in the supplementary material the authors mention that particle images are shifted by their published poses since the particles in this dataset are significantly out of center in other words the pose estimation is not fully validated on experimental data specifically while the rotational component of pose estimation is validated the translational component is not while it may well be the case that the particles in this dataset are significantly out of center this is a common occurrence in many cryoem datasets and it is something that a robust method should be able to handle given the fact that an important contribution of the proposed method is its validation on experimental data this discrepancy significantly detracts from that contribution furthermore the fact that only one experimental dataset is tested also reduces the
scope of validity compared to the cryodrgn and relion multibody refinement papers which present results on three or more experimental datasets the overall presentation of the work is clear with most concepts being well defined given the importance put on amortized inference it would be good to specify early on what is meant in a formal manner to reduce any possible confusion about the term in a similar vein a definition of the hartley transform and its relative advantages over the fourier transform would be a good addition as stated above the significance of this paper is the joint amortized inference of pose and latent space applied to experimental data as such it provides an important methodological milestone for the processing of cryoem data however the numerical results do not completely back up these claims additionally the authors also do not make clear what it is about their setup that makes the method successful in other words what has been missing from previous attempts at joint inference such as the work of rosenbaum et al 24 that is addressed by the current work as stated above the high snrs of the synthetic data limited experimental validation and lack of validation for translational pose estimation in the experimental data detract from the overall claims of the manuscript and should either be resolved or acknowledged in the text the authors discuss potential negative societal impact of their work and the field in general quite well docsepthe paper describes a method cryofire to recapitulate the 3d structure of a biomolecule from cryoem images using a coordinatebased neural model that enables continuous representation this is inspired by recent advancements of cryodrgn and cryodrgn2 for ab initio 3d molecular reconstruction different from existing methods the authors propose to use an encoder to jointly estimate the particle poses and conformation state a latent vector from a set of 2d projections the predicted poses and latent vector are then fed into a feedforward neural network to aggregate each particle image into an implicit representation of the scattering potential volume the encoder and decoder are trained in an endtoend fashion using l2 distance this method achieves comparable results on both simulated and experimental datasets to sota in addition the paper shows that such an autoencoderbased method can significantly speed up the reconstruction time compared to cryodrgn2 when using a large number of 2d projections strengths the paper presents a novel approach for single particle cryoem that quantitatively and qualitatively works well in addition the paper is overall well written and the method is well evaluated on an experimental dataset weaknesses some claimed contributions and technical details might need to be clarified and discussed the limitations of the current method are missing considering single particle cryoem is an important but challenging problem some discussions will be helpful for future work docsepthe paper proposes a novel method to infer both pose and conformational states of protein structures from raw cryoem images in a way that is amortized over the size of the dataset the authors compared the proposed method with cryodrgn2 on several synthetic and experimental datasets they demonstrated that their method requires less time compared to cryodrgn2 by sacrificing little accuracy the paper is wellwritten and overall easy to follow the motivation and importance of the problem are well discussed the paper provides a thorough overview of the literature the proposed method is
novel to the best of my knowledge however there are several issues with the experiments baselines and results the authors mentioned that rosenbaum et al's method was the only one to perform amortization for both pose and conformation estimation in such a case that particular method should have been kept as a baseline not cryodrgn2 since it amortizes for conformation estimation only even the comparison with cryodrgn2 is not very impressive it is quite usual that methods with an exhaustive search for pose would take more time with more accuracy if the proposed method cryofire could achieve better or similar accuracy with less time then it would have been much more impressive but cryofire performs much worse than cryodrgn2 even in large datasets no comparison has been shown for conformation estimation is cryofire as good as cryodrgn2 for conformation estimation as well can it recover some conformations that can not be recovered by cryodrgn2 or 3dva or e2gmm the paper mentions rosenbaum et al's method only worked on the simulated dataset but it did not discuss why it can not be applied to real datasets what makes it inapplicable for real datasets but applicable for realistically simulated datasets or is it the case that the authors are the first to apply the method to the experimental dataset this is clearly not a big contribution the authors discussed the limitations and possible negative social impact of the work docsepthe paper proposes an ab initio heterogeneous reconstruction method for spa cryoem named cryofire there are multiple advantages to the proposed approach and most of them come from the ability to jointly estimate both poses and conformations allowing it to skip the computationally expensive step of pose search the authors show the effectiveness of the method on multiple datasets including one experimental and compare the approach with the sota of cryoem nnbased heterogeneous reconstruction cryodrgn2 the authors promise to provide an opensource implementation of cryofire upon publication strengths 1 the paper is written cleanly and i especially enjoyed the succinct formal description of cryoem image formation 2 the method is novel and presents the first instance of amortised inference for ab initio cryoem heterogeneous reconstruction 3 joint estimation of poses and conformations brings one order of magnitude speedup scaling much better with the everincreasing amount of data produced by cryoem weaknesses 1 as the authors themselves point out the parameterisation calculated with cryofire is not uniquely defined a rotation of the molecule can be equivalently represented by ri or by a change of conformation state zi the proposed workaround in the paper is to use a training schedule and have poseonly phases at the start of training and then every n epochs additional work is required to be able to decouple rotation from conformation without training schedules 2 cryofire is directly compared with cryodrgn and cryodrgn2 table 1 as well as cryosparc table s2 i dont think that this represents the software frameworks currently used by cryoem groups the authors adequately addressed the limitations and potential negative societal impact of their work
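Several comments above (poses and in-plane shifts, the hartley transform, the distinction between rotational and translational pose validation, and the praise for the formal description of image formation) presuppose the standard linear cryo-EM forward model. For readers outside the field, one common way of writing it is sketched below; the notation and conventions here are generic assumptions for illustration and may differ from the paper's own formulation.

```latex
% Standard linear (weak-phase) cryo-EM forward model -- illustrative notation only.
% Each observed particle image Y_i is, in Fourier space, a CTF-modulated central
% slice of the 3D volume transform at an unknown rotation R_i, phase-shifted by an
% unknown in-plane translation t_i, plus noise:
\hat{Y}_i(k_x, k_y) \;=\; C_i(k_x, k_y)\,
    \hat{V}\!\left( R_i \, [k_x,\, k_y,\, 0]^{\top} \right)
    e^{-2\pi i\, (k_x, k_y) \cdot t_i} \;+\; \hat{\varepsilon}_i(k_x, k_y)
```

here \hat{V} denotes the transform of the 3d scattering potential (the hartley transform is a real-valued relative of the fourier transform for which the same slice property holds, which is why it is convenient for coordinate-based decoders), C_i is the contrast transfer function, and the central-slice theorem is what lets a decoder evaluate \hat{V} on a rotated plane instead of integrating projections in real space.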
the paper studies the problem of simultaneously estimating poses and conformation of a biomolecule cryoem images it presents a pipeline which integrates reconstruction and conformation estimation leading to significant time savings compared to methods that alternate between accurately estimating poses and conformations conventional pose estimation is expensive because it involves searching over the space of rigid body motions by repeatedly rendering images from from various viewpoints the paper proposes an autoencoderlike structure which generates conformation and pose estimates for each image which are used by a decoder to produce an estimate of the conformation reviewers generally evaluated the paper positively noting that it achieves an order of magnitude speedup compared to a conventional baseline reviewers note that although elements of the paper have previously appeared the use of autoencoderlike architectures for cryoem reconstruction the use of amortized inference over conformations and poses the combination employed here is novel reviewers generally appreciated the papers experimental results while raising some concerns about the snr and baselines for comparisons finally reviewers noted that the paper provided a clear exposition of both cryoem and the proposed techniques overall the paper exhibits a wellchosen combination of learning techniques which lead to performance improvements for a problem of significant scientific interest
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

the wildtime benchmark collects 7 temporal out-of-distribution datasets, among which 2 are image datasets, 1 is a tabular dataset, 1 is a graph (molecular) dataset, and 3 are text datasets. the paper aims at time domains and defines temporal distribution shifts over multiple time steps for each dataset. they test the in-distribution and out-of-distribution performance of multiple methods, including a comparison of different continual learning and invariant learning algorithms. the paper also discusses the observations from the experimental results and analyzes the comparisons to give the brief conclusion that no existing continual/invariant learning approach generally outperforms erm on temporal distribution shifts.

1. the out-of-distribution problem is an essential field increasingly gaining attention, and temporal out-of-distribution datasets are not discussed much by existing work; therefore collecting datasets for this field is a meaningful contribution.
2. methods specifically designed for temporal distribution shifts are majorly underdeveloped. the paper evaluates the performance of different continual learning and invariant learning methods on their datasets, providing useful data to guide the development of future temporal ood methods.
3. the paper is generally easy to follow and understand.

1. my major concern is in the data split for the evalstream setting, which should be closer to the acknowledged domain design in ood datasets, and models can incrementally learn between time steps; the split seems questionable (see the sketch after this review). for example, the authors allocate 10% of the data at each timestep for test and the rest for training, while for ood testing all samples in each year are used. this means that the id training has 90% of the data, the id test 10%, but the ood test 100% of the data. i suppose this means that they use the data other than the current timestep to do the ood test, or it would not fit my understanding of the ood test setting, but this unbalanced id/ood train-test split would make the results incomparable with evalfix, and im not sure if this would negatively affect the best performance of incrementally learning strategies. the time steps could be designed as domains like general ood datasets.
2. the selection of methods and models lacks motivation/reasoning. the paper evaluated algorithms on different model backbones but gives no comparison of other model backbones. since the conclusion includes that no existing continual/invariant learning approach generally outperforms erm on temporal distribution shifts, careful selection and reasoning of methods and models should be vital. also, some important ood-related methods are not compared or discussed: frequently used methods in ood generalization recently, such as mixup (zhang hongyi et al, mixup: beyond empirical risk minimization, arXiv:1710.09412, 2017), are not compared; also, image and text data have no image/text-specific ood-related methods compared.
3. some experimental details are not provided; the paper lacks some necessary data to make some claims convincing. for example, algorithms such as irm have hyperparameters that can take a wide range of values; the paper did not provide the finetuning process or the reason for parameter selection, and the hyperparameters can largely affect the performance and observations.
4. the github code documentation is not well organized. as datasets are primarily for community use, the benchmark should be easily usable and the documentation should be user-friendly. the github repo has few starting instructions, so it is difficult to use, for example the modules for the datasets; specifically, i would suggest the github documentation be actually maintained.
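The split concern in weakness 1 can be made concrete with a small sketch. This is not the benchmark's actual code; the record layout, function name, and the choice to hold out the same fraction at every timestep are assumptions used only to illustrate the reviewer's suggestion of drawing id and ood test sets in a comparable way:

```python
import random
from collections import defaultdict

def split_by_timestep(records, eval_timestep, test_frac=0.1, seed=0):
    """Illustrative per-timestep split: every timestep (in-distribution or
    out-of-distribution) keeps the same held-out fraction, so ID and OOD
    test sets are drawn identically. `records` is assumed to be an iterable
    of (x, y, t) tuples; this is a hypothetical layout, not the benchmark API."""
    rng = random.Random(seed)
    by_t = defaultdict(list)
    for r in records:
        by_t[r[2]].append(r)

    train, id_test, ood_test = [], [], []
    for t, rows in by_t.items():
        rng.shuffle(rows)
        k = max(1, int(len(rows) * test_frac))
        held_out, rest = rows[:k], rows[k:]
        if t < eval_timestep:          # past timesteps: training + ID test
            train.extend(rest)
            id_test.extend(held_out)
        else:                          # future timesteps: OOD test only
            ood_test.extend(held_out)  # same 10% fraction, not 100%
    return train, id_test, ood_test
```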
this paper advocates dealing with the issue of distribution shift over time for robust model performance. the authors curated 7 datasets from existing datasets that reflect realistic temporal distribution shifts. conventional domain generalization methods as well as continual learning-based methods are benchmarked under two different settings (evalfix, evalstream). the experiments demonstrate that the benchmarked methods are limited in dealing with temporal distribution shifts across all proposed settings. this work hopes to assist the community in tackling the problem of temporal distribution shift by providing benchmarking datasets and baseline models.

this paper is motivated and clearly presented. it provides a set of curated time-ood datasets and results of conventional methods; it would be a suitable starting point for a researcher interested in working on this problem. while the realistic temporal issue needs to be addressed, there are some other issues to consider.

the temporal shift distributions curated in this paper are often intermixed with other types of distribution shifts. for example, in the fmow dataset the authors directly partitioned the dataset by year, ignoring location consistency across sets, which would result in performance degradation possibly due to shifts caused by locations. in the yearbook dataset, whether the male-to-female division is consistent between the time periods 1930-1970 and 1971-2013 also affects the performance difference. thus an important question is how to ensure that the performance degradation is purely due to time distribution shift and not due to inconsistencies of other domains.

the purpose of this paper is to propose datasets with time distribution shift; however, some settings are not quite consistent with reality. for example, under the main evalfix setting the time span between train and ood test for many sub-datasets is huge: in task 1 the training set is on average 20 years older than the test set, and this setting may be uncommon in the standard ml development pipeline, where what we do know is that model updates are frequent. it would be helpful if you could resolve my doubts about this, namely that some of the proposed dataset settings are not necessarily reality-oriented.

the authors are examining distributional shifts in time, also known as concept drift. they construct a benchmark from a series of 7 existing datasets, develop two evaluation strategies, and evaluate a range of methods (augmentation, invariant learning, and continual learning methods) for improving robustness on this benchmark. the authors show that no methods consistently outperform erm, that augmentation can sometimes be detrimental, and that invariant learning does better than continuous/incremental learning approaches. the authors clearly show that current tools are insufficient for combatting distributional shift in time. this is a good paper, but the documentation is lacking, and i will be happy to revise my assessment when the authors provide the necessary bits as described below.

the authors have provided a valuable and diverse benchmark which focuses exclusively on distributional shift in time. this effect occurs in many services and practical applications and therefore has broad implications for both research and industrial practitioners. the work itself examines a broad range of applications, tasks, and modalities. there are no negative ethical or societal implications in this work.
the current work does not have any significant technical weaknesses; however, it would be interesting to examine time-shift in complex structured prediction tasks such as segmentation, autoregressive classification, and regression. additionally, it would be interesting to also consider bayesian and related methods such as ensembles and how much improvement in robustness they are able to provide, and finally to explore the choice of model architecture and its impact on robustness.

however, there is a weakness in the support documentation. it is unclear how the dataset is to be supported, under what licenses the data are shared/reused, and how the long-term support of the dataset will be carried out. the github (which i know is being refactored) is inconsistent in how the data is to be obtained (either via notebooks or via google drive etc); this presents a challenge to accessing the data. these issues need to be properly addressed for this paper to be accepted; i will be very happy to revise my assessment of the paper when the authors provide adequate documentation.

this paper proposes a benchmark for addressing temporal shifts in machine learning systems. the motivation of the paper is clear: the authors bring up a slight problem of large-scale machine learning systems where the data changes over time, and they bring a practical suggestion for evaluating new models over time-based shifts. the definition of temporal shifts is very brief, so the importance of these shifts in practice is not well explained. since this paper aims to propose a benchmark for mitigating temporal shifts, it should contain a brief definition of the meaning and effects of this type of shift in each dataset (if there is any difference) and clarify in more detail the difference between this type of shift and other data shifts.

addressing a real-world problem in machine learning is important for industrial machine learning systems, and the purpose of the paper is clear. there is no synthetic data creation or changes, since the domains are based on actual time frames over time. the scope of the paper is highly relevant to current machine learning research borders. the evaluation method is described clearly, and results are reflected in the paper in a clean and understandable way. the dataset contains various domains with two different tasks. although the contribution is clear, there is no real-world scenario in the introduction for introducing the importance of the problem to the reader; all variables in the formula were not defined; there is no practical sample for how to use this benchmark or any practical guideline for users; there is no introduction for any framework or library which can implement the benchmark in practical projects; i couldnt find any link to the dataset or benchmark tool; the evaluation metrics part is not well described; there is no ethics section or any consideration about that.

this paper proposes a dataset for distribution shifts that incorporates the passage of time using timestamp metadata. this reflects natural temporal distribution shifts; consequently, the proposed dataset wildtime can be viewed as a more plausible dataset for real-world situations. while the previous dataset wilds concentrated on image datasets, this paper deals with various datasets including drug discovery, health care, and news classification. moreover, they suggest assessment of the proposed dataset based on the manipulation of timestamp metadata, known as evalfix (time-series setting) and evalstream (continual learning setting). to demonstrate the efficacy of the proposed dataset, the authors conducted extensive experiments.
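For readers unfamiliar with the two settings the reviews keep referring to, here is a minimal sketch of how they differ. It is illustrative pseudocode under assumed interfaces (a list of per-timestep datasets with a `.year` field), not the benchmark's released evaluation code:

```python
def eval_fix(timesteps, split_year, train_fn, eval_fn):
    # Train once on everything before the split point, then report
    # performance on each future timestep (out-of-distribution).
    model = train_fn([t for t in timesteps if t.year < split_year])
    return {t.year: eval_fn(model, t) for t in timesteps if t.year >= split_year}

def eval_stream(timesteps, train_fn, update_fn, eval_fn, horizon=1):
    # Incrementally update the model as new timesteps arrive and always
    # evaluate on the next unseen timestep(s) before updating on them.
    model, results = train_fn(timesteps[:1]), {}
    for i in range(1, len(timesteps) - horizon + 1):
        results[timesteps[i].year] = eval_fn(model, timesteps[i : i + horizon])
        model = update_fn(model, timesteps[i])
    return results
```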
they consistently tell us that the proposed dataset represents distinct properties of both the in-distribution and out-of-distribution data.

so far, the domain of continual learning has mainly employed traditional datasets such as cifar and imagenet by gradually increasing class information and manipulating the available datapoints; many researchers have been encouraged by this point to easily pursue continual learning studies. however, this setting is somewhat far from real-world circumstances; accordingly, the authors provide datasets that match real-world problems. the proposed dataset was evaluated with several baselines recently utilized in distribution shift and continual learning to confirm its properties by comparing their performance, allowing researchers who need to use the data to do so effectively. through the baselines and extensive experiments, the proposed datasets clearly follow characteristics of both continual learning and distributional shifts.

this work tells us that invariant learning algorithms have a high level of performance. i guess that the hyperparameters k and t were well calibrated; if the appendix results are provided in the manuscript, there appears to be no meaningful difference as compared to continual learning. im wondering how invariant learning algorithms properly work when decreasing the values of k and t; in this regard i have a doubt whether the invariant learning algorithms observe more datapoints in the same batch or epoch compared to the continual learning baselines.

i might think that the provided data adequately describe the distribution shift; however, when i look at table 1 and table 12, it seems that empirical risk minimization works well, allowing the out-of-distribution dataset to be covered with in-distribution data. when the number of available datapoints is reduced, it would be possible to demonstrate that erm is ineffective: if it could be demonstrated that the existing distributional shift and continual learning methods perform better than erm on smaller datasets, then erm would be rendered obsolete, which i believe was originally intended (a sketch of such an ablation is given after the reviews).

the paper proposes the wildtime benchmark of seven datasets to benchmark methods under temporal distributional shift. two evaluation settings are proposed: the first is evalfix, which uses a fixed training and testing split as in supervised learning and is usable by broader research communities; the second is evalstream, which is closer to the continual learning evaluation setting. the paper benchmarks two categories of approaches, domain generalization methods and continual learning methods. the results show that existing methods cannot address the temporal distributional shift problem well, leaving room for future methods to address this problem.

while most previous distributional shift benchmarks use artificial distributional shifts, e.g. image corruptions [21], background [63], color [3], the proposed wildtime benchmark focuses on natural temporal distributional shift, which is a problem that naturally exists in the real world. this is a significant contribution to this community and can facilitate future works to develop robust algorithms against temporal distribution shift. the new benchmark is comprehensive in terms of evaluation metrics, evaluation methods, and diversity of datasets. it will be better to discuss potential research directions to address this problem.
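The subsampling ablation suggested in the review above could look roughly like the following sketch. The loader, training, and evaluation callables are placeholders (assumptions), since the point is only the experimental loop, not any specific implementation:

```python
import random

def subsample_ablation(train_set, ood_test_set, methods,
                       fractions=(1.0, 0.5, 0.2, 0.1), seed=0):
    """Compare ERM against other methods as the training set shrinks.
    `methods` is assumed to map a name (e.g. 'erm', 'irm') to a callable
    that trains and returns a model; `evaluate` is likewise an assumed helper."""
    rng = random.Random(seed)
    results = {}
    for frac in fractions:
        subset = rng.sample(train_set, int(len(train_set) * frac))
        for name, train_fn in methods.items():
            model = train_fn(subset)
            results[(name, frac)] = evaluate(model, ood_test_set)  # assumed helper
    return results
```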
### Summary:

i hereby sort the strengths and weaknesses that were mentioned together.

strengths:
- the usefulness to the community and the need to develop methods that address ood data
- the paper is well written

weaknesses:
- there are some questions about the data split, which seems major but was also addressed
- adding locational shift on top of the temporal shift, which will be a confounding factor in experiments
- there are concerns about the time between certain batches of the data: one reviewer claims it is too big in a certain setting, another mentions overlap in another setting
- information on experimental choices and details and calibration of hyperparameters of the algorithms
- user friendliness of documentation, github repo, etc

various reviewers have increased their score during the rebuttal. all together, the reviewers have been very thorough, nicely picking up on these matters. while i appreciate their thorough job, i want to place the weaknesses into the context of the usefulness to the community that was also consistently mentioned. some reviewers mention that they are still not 100% convinced, which is an actionable point to the authors: please do keep an open mind and seek to further improve on these points. i see no dealbreakers in the weaknesses, and due to the generally positive sentiment i am very inclined to give a positive recommendation.
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

summary: the goal of this paper is to improve our understanding of reward-agnostic metrics drawn from the literature through comparison with human behaviour and task reward. this paper compares two intrinsic reward methods against three baselines on three atari environments on five metrics, including task reward, a simple metric for human similarity, and three information-theoretic assessments of aggregated observation counts drawn from the literature, which they call task-agnostic metrics. the authors report the correlation between the different metrics.

strengths and weaknesses: constructing a comparative understanding of the many methods for exploration, intrinsic motivation, and curiosity is a vastly underdeveloped area. i think that this papers goal is to do some of that work, which i see as a strength. however, the experiments are not appropriately designed to provide reliable results, and the paper includes substantial errors in understanding the existing literature; as the paper is essentially an empirical survey, appropriately representing the other literature is critical.

visually inspecting figure 4, it appears that the results would be completely different if the noop agent was excluded, and to a lesser extent the random agent. my concern is that these baselines are categorically different from the agents we are actually interested in and appear to strongly affect the results. for example, without the noop agent it appears that the correlation between human similarity and empowerment would be much weaker and might actually be negative.

the human similarity metric does not seem to be a meaningful metric for what it is designed to measure. this is of particular concern to me because much of the interpretation of the data relies on comparison with the human similarity metric, so using such a simplified metric doesnt seem sufficient. the human similarity data only considers which observations an agent shares with the human data, without regard for how many times each one visits a particular state: a human might make exactly one observation in a given bucket, and an agent making only one observation in that bucket would receive the same score for it as an agent that returns to that state millions of times. the generalization between state observations created by the preprocessing seems like it can only exacerbate the issue.

a similar concern arises when looking at the curiosity metric. using entropy of sensory input visitation as a metric measures uniformity of visits to states rather than measuring the ability of the agent to visit as many states as possible; in particular, you can construct examples in which visiting a small subset of states with uniform frequencies results in higher performance on this metric than covering more states but with less uniform distributions (a small numeric example is sketched below). in principle, most researchers designing algorithms to improve exploration would care about this distinction: intuitively, actually visiting a state and ensuring that the agent has observed what is there is important for ensuring the agent can find the optimal parts of the world.

the use of the word curiosity in this paper is problematic. overall, using the word curiosity to refer to both a metric and a set of methods leaves quite a bit of room for confusion for the reader; in particular, the methods and their metric are not as closely related as the authors suggest in the paper. the authors appear to have the misconception that methods like icm and rnd are designed to increase the entropy over observations (stated on page 6); this is not the case.
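To make the uniformity-versus-coverage point above concrete, here is a small sketch with made-up visit counts (hypothetical numbers, assuming NumPy); it only illustrates the reviewer's argument and is not code from the paper:

```python
import numpy as np

def visit_entropy(counts):
    """Shannon entropy (nats) of the empirical visit distribution."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()
    p = p[p > 0]                      # ignore unvisited states
    return float(-(p * np.log(p)).sum())

narrow_but_uniform = [25, 25, 25, 25]                    # 4 states, even visits
wide_but_skewed = [91, 1, 1, 1, 1, 1, 1, 1, 1, 1]        # 10 states, skewed visits

print(visit_entropy(narrow_but_uniform))  # ~1.39 nats
print(visit_entropy(wide_but_skewed))     # ~0.50 nats, despite covering more states
```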
importantly, these rewards are designed to be consumable, so they eventually no longer shape the behaviour of the agent, and the agent is left to pursue (typically external) goals that could result in visit frequencies being highly nonuniform. the word curiosity has been used in the realm of reinforcement learning to refer to many very different methods, not necessarily methods that measure probability under a trained density model, and it isnt appropriate to provide this blanket definition of the word curiosity without some language to tell the reader that the word curiosity is simply a shorthand in this paper, in particular to refer to methods that fall under the given definition. while this paper makes clear calls to the foundation of ideas from the literature that are employed in this paper (eg work on curiosity, information gain, empowerment, human performance on atari, etc), there is no discussion of related kinds of comparative work that already exists in the literature: neither the literature comparing multiple intrinsic reward agents nor the literature comparing the exploratory behaviour of rl agents with that of humans is discussed.

recommendation: i am recommending that this paper be rejected on the basis of lack of appropriate evidence for their claims and inappropriate use of language to describe curiosity, a word with a diverse history in the literature.

specific examples of issues:
- the characterization "curiosity encourages encountering rare sensory inputs measured by a learned density model" (p 1) does not capture the definition of curiosity used as a metric, the cross entropy of future inputs under a density model trained alongside the agent (p 4). the characterization is inherently contradictory: if curiosity is successful, what does it mean for a sensory input to be rare? the characterization might be better captured by a definition that requires visiting many states.
- the go-explore algorithm by ecoffet et al 2019 is explicitly not an intrinsic motivation algorithm (for example, see the paragraphs devoted to contrasting go-explore with im methods on page 2 of ecoffet et al 2019), and the paper provides little evidence of the empirical success of im methods, so citing the paper for such evidence ("despite the empirical success of intrinsic motivation for facilitating exploration", p 1) does not appear appropriate.

additional feedback (here to help, not necessarily part of decision assessment):
- i found myself trying to come up with a more appropriate name for the metric you call curiosity, and i think that observation entropy might capture the mathematical definition appropriately.
- more data might improve the quality of the results of your experiments.
- if you are interested in including other intrinsic-reward methods in future experiments, a list of fifteen different intrinsic rewards is included in https://arxiv.org/abs/1906.07865
- can you clarify what preprocessing is done for the images fed to the agents? this information belongs somewhere prior to "we first convert the rgb images to grayscale as they were seen by the agents" (p 3).
- i cant find the definitions of a (likely the action set) and x (likely the set of possible 8x8 discretized images) used on p 3, and it would be helpful to have these notations defined explicitly.
- "has enable agents" (p 1): typo
- "atari learning environment" (p 2): i think this was meant to be arcade learning environment
- "taskagnostic metric" (p 5): typo
- "human similarity it correlates" (p 8): typo
- "for this reason intrinsic rewards (burda et al 2018b) or human demonstrations (aytar et al 2018) are important to succeed at the game" (p 12): rather than "are important" i would suggest "have been important", since there is no evidence that there doesnt exist some method of another category that succeeds in montezumas revenge that hasnt been published yet.
- "chooses one of a set" (p 12) reads a little strangely, since the agent is choosing an action, not a set.
- icm is not designed to be a complete agent, as it can potentially be used with a range of policy learning methods (pathak et al 2017, p 16), and so the phrase "is an exploration agent" (p 12) is not accurate. i understand that you are using a ppo agent augmented with icm following burda et al 2018a, but that would be helpful information to include in your description of the agents in the appendix, perhaps along with a reminder to the reader about where to find the openai implementations that you are using.
- in appendix d, the explanation of icm (p 12) would benefit from explaining what learning algorithm/agent architecture is used to optimize the intrinsic (or intrinsic plus extrinsic) reward, to parallel the description given for ppo.

thanks for this paper. curiosity and exploration is an important topic for rl research, and we need more in-depth analysis of existing methods. the paper as it stands provides useful but expected insights. the difficulty ive with the paper is that its not clear what exactly youre after here: "we find that all three objectives correlate more strongly with human behavior than with the task reward; moreover, task reward with curiosity better explains human behavior than task reward alone". if the idea is to convey the message that humans display curiosity (as measured by your interpretation and way of measuring it), then there is a large body of text on human curiosity that already discusses these topics; additionally, for this you dont need to train artificial agents. "simple implementations of curiosity, empowerment and information gain correlate substantially with human similarity; this suggests that they can be used as taskagnostic evaluation metrics when human data and task rewards are unavailable": following from the above comments, all the research on intrinsic reward uses this intuition already, so its not clear what is added extra here. in addition, as discussed in the notes below, empowerment and info gain, in the simplistic way that they are implemented, are not actually good measures, as a random agent is able to score strongly on those without having any intelligence.

notes:

table 1 is misplaced on page 1.

section 3.1, discretisation: what is the effect of the choice of 8x8 on the overall results? what wouldve happened with 16x16, for example? maybe explore these kinds of choices that will impact your results.

section 3.2, human similarity: the sentence "we suggest that a more general measure of intelligence may relate to similarity between the agents behavior and human behavior in the same environment, ie using human behavior as a groundtruth" overstates the originality of this suggestion, as this is not the first time that similarity to or imitation of human behavior is suggested as a measure of intelligence; perhaps you may want to restrict this to certain papers that you feel take a different, task oriented approach.

eq 3: i wouldve thought the human similarity measure would capture the distribution of actions in a particular state as the primary measure rather than the probability of being at the same state (expressed by the discretised image); while due to previous actions an agent or human will end up in a certain state, the proposed measure captures the action similarity implicitly rather than explicitly.

eq 3: any particular reason for using the jaccard index with positive probability thresholds as the measure of similarity? i think a probabilistic measure such as kl-div would be a more appropriate way to work with distributions of states than thresholded jaccard similarity.
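A minimal sketch of the two options contrasted in the note above, set overlap (Jaccard) versus a distributional comparison (KL), over discretized state visit counts; the dictionaries and threshold are hypothetical, and this is not the paper's implementation:

```python
import numpy as np

def jaccard_similarity(agent_counts, human_counts, threshold=1):
    # Set overlap of states visited at least `threshold` times.
    a = {s for s, c in agent_counts.items() if c >= threshold}
    h = {s for s, c in human_counts.items() if c >= threshold}
    return len(a & h) / len(a | h) if (a | h) else 0.0

def kl_divergence(agent_counts, human_counts, eps=1e-8):
    # KL(human || agent) over the union of visited states, with smoothing;
    # unlike Jaccard, it is sensitive to how often each state is visited.
    states = sorted(set(agent_counts) | set(human_counts))
    p = np.array([human_counts.get(s, 0) for s in states], dtype=float) + eps
    q = np.array([agent_counts.get(s, 0) for s in states], dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float((p * np.log(p / q)).sum())
```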
the proposed measure captures the action similarity implicitly rather than explicitly eq 3 any particular reason for using jaccard index with positive probability thresholds as the measure of similarity i think a probabilistic measure such as kldiv would be a more appropriate way to work with distributions of states than thresholded jaccard similarity figure 2 it seems that random agent scores highly in empowerment and information gain metrics this is very counter intuitive since 1 the agent doesnt learn from experience its information gain should be zero 2 and high score in empowerment may suggest empowerment as computed here is not a good metric for measuring intelligence table 2 this is an important table but has been placed in appendix making it not only hard to read the paper but also i would think is put there to meet the paper limits as otherwise it wouldve been located where the results are being discussed i suggest either to find a way to include it in the main text or remove direct discussion about it from the main results there is a lot of repetition in the text so it should be possible to be brief and concise but add important results to the main text docsepthis work studies four taskagnostic metrics for evaluating reinforcement learning agents human similarity curiosity empowerment and information gain experiments were conducted with three selected rl algorithms ppo icm and rnd on selected atari games the results show that a combination of task reward and curiosity better explain human behavior and some nonreward metrics correlate better with human behavior than task reward the authors propose that such taskagnostic can be used as intrinsic signals for training rl agents when task reward and human data are not available in an environment pros taskagnostic metrics are useful for evaluating rl agents without access to task reward measuring behavior similarity with human data also provides insights into different behavior of rl algorithms the insights from analyzing the three taskagnostic metrics correlation with both task reward and human behavior similarity are useful for designing new rl algorithms as indicated by the authors the paper is well written with clarity and includes all experimental details for reproducibility cons the intrinsic metrics studied in this paper are not novel and have been used in various existing rl algorithms alongside task reward for training agents to demonstrate the acclaimed usefulness of the proposed metrics it is desired to see experiments training rl agents with only taskagnostic metrics the experiments conducted are on agents lifetime data to gain better understanding of the learning dynamics of rl algorithms it would be useful to see evaluation of the data at different learning stages update after reading the assessment of other reviewers and the referenced papers in the intrinsic reward literature i am reassured that the methodsmetrics proposed in this paper are not novel and as pointed out by other reviewers have been studied under other terminologies in different prior works the analysis of these metrics correlation with human data is still an interesting piece of result but is not significant enough to become the sole contribution of an iclr paper therefore i move my initial assessment of 6 to 4docsep summary this paper proposes to study three types of intrinsic motivations curiosity empowerment and information gain they propose to compute these measures on the lifetime experience of rl agents and to use them as behavioral metrics to 
evaluate these metrics they perform a correlation study with respect to two traditional behavioral metrics the task reward and human similarity strong points the paper is clearly written and well organized i believe it is important to conduct studies that do not present a novel algorithm but try to gain understanding on existing approaches designing new behavioral metrics for rl agents especially ones that do not require rewards is indeed a good idea and will be useful to the community all details required for reproducibility are present and the code will be released weak points here i list weak points in order of increasing importance downsampling these taskagnostic metrics rely on the downsampling of the frame i feel like this would not work well for minigrid nethack mujoco etc can you discuss that point about the choice of environments this paper investigates the evaluation of agents without rewards in three environments that are explicitly rewardbased with well defined rewards the justification of this choice is also not really discussed except from we chose these environments because they span a range of complexity freedom and difficulty breakout and seaquest barely require any exploration breakout is arguably close to a dense reward problem i wish this study involved more environments especially environments designed specifically to study exploration issues like nethack or minigrid it would be nice to study the correlation of taskagnostic metrics and human similarity in environments without rewards which are the target environment of such metrics in the first place human similarity measure as i understood it the human similarity measure is the fraction of downsampled states that are visited by both the rl agent and the human demonstrator over the size of the union of these states sets we might consider the human coverage as the target coverage then the human similarity metric is nothing else than a coverage metric can you compute the discrete state coverage metric of rl agents and the correlation to the human similarity metric i believe a more relevant metric would evaluate whether agents select the same actions when presented the same states i dont think the sticky actions are a problem here one could compute matrices q of size x a that empirically estimate the probability of selecting any action in any state for both the human and the rl agent whenever the environment decides to use a sticky action the agents policy is still selecting an action that we can use instead of the sticky one once we have these matrices q then we can compute their average termbyterm difference and use the opposite as a human similarity measure do you have an opinion on that methodology i am concerned about the validity of the results presented in this paper as the method is not very rigorous correlate substantially is highly subjective usually one would use correlate significantly and support this claim by statistical evidence of the significance of the correlations please report which correlation measure is being performed pearson spearman kendall and report the pvalue of the associated statistical test scipy returns it automatically with the coefficient this is important to assess whether the evidence is sufficient to claim that there is a correlation in fig 3 correlations measures are reported over 7 points this is quite low and requires statistical tests to be interpretable table 3 measures are episodic returns for 1 seed this should be said clearly and one should be very cautious with the 
interpretation of these results when testing multiple hypotheses in parallel a good practice is to implement the familywise error rate correction of the confidence level if you test for one correlation with confidence level alpha5 probability to observe a correlation where there is not stays below 5 then testing n correlations results in a higher chance of observing a false positive let us say n5 for this reason the fwer correction proposes to decrease the confidence level of each test by a factor n so that the overall confidence level of the multiple tests remains alpha this means that to test for correlations in figure 3 10 correlations by graph we may want to require pvalues below alpha 10 eg 0005 for an overall 5 confidence level theoretically this should be done for all three environments so 30 an alternative is to formulate hypotheses a priori instead of searching for correlations in the wild we find that a linear model of curiosity empowerment and infogain can predict task reward and human similarity with correlations of 036 and 086 respectively im not sure this is a legitimate approach im not entirely sure so its open for discussion this boils down to training a prediction model from taskagnostic metrics to the human similarity score and to evaluate its performance correlation on the same training data usually you would have an hypothesis a particular linear combination of these that you would evaluate compute the correlation and test the corresponding significance the noop condition i see no clear reason to introduce a noop agent in this study this agent does nothing which by construction results in the minimization of all metrics studied here i think the three points introduced by the noop agent in each of the three environments are the main reason explaining the correlations in fig 4 if you remove them then i believe most correlations disappear some might even become anticorrelated human sim vs infogain and human sim vs empowerment please report the correlation measures and significance without these points recommendation and justification in the present state of the paper i recommend a rejection score 4 i think the topic of research is important and the authors should pursue in that direction however the methodology of the current version of this paper is not good enough the introduction of the noop agent may be explaining most of the correlation discussed in the results no statistical test has been conducted to show evidence for the significance of the results in order to update my score i would need a more rigorous correlation study that asserts the significance of the correlations using corrections i also think the noop condition should be removed the correlation between the curiosity score and human similarity score might still show but it is probably that most of the others would not the introduction of a humansimilarity metric that evaluates the similarity in decision making instead of state visitation might however bring interesting results feedback to improve the paper not part of assessment in the abstract compute the objectives sounds weird here they are behavioral metrics although some rl algorithms can be designed to optimize them in which case they are objectives what do you mean by estimate intrinsic objectives while the agents are learning which often requires complicated approximations which complicated approximations what is a complete or optimal measure of agent intelligence i think using agent intelligence is vague and not well defined i am curious what is 
the size x for the three environments task reward is it the mean over the lifetime the sum is it computed during training episodes including exploration noise eg epsilon greedy figure 2 what are the axes can you explain how you normalize the scores im guessing its normalized between to 01 by the range across different environments colormap for correlation plots is not ideal its difficult to appreciate the colors maybe pick something with more different colors not just a gradient between two how do you normalize and aggregate task reward and curiosity into a unique metric human similarity exhibits stronger correlations with the taskagnostic metrics we consider than does task reward not true for empowerment 057 06 what is the set of states the curiosity measure is computed on if it is the set of states visited by the agent then having a uniform exploration of a very small set of states would result in a high curiosity score i feel this is not what we want we want uniformity but also coverage the term curiosity is quite general and has been used for many purposes in the litterature for this reason i think it is not the best term to use here stateentropy would be much more descriptive when defining curiosity via a higher curiosity score implies a wider variety of states observed the authors cite oudeyer et al 2007 i just checked it and this paper actually presents the classification of several principles to implement the concept of curiosity or intrinsic motivations it also presents an algorithm that maximizes the agents learning progress this is different from the diversitymaximization approaches this paper refers to it would have been interesting to present algorithm optimizing for empowerment and information gain here the two algorithms both optimize for curiosity so far the random agent is the one maximizing these metrics one would hope that algorithms guided by these objectives would do better this result would support the intuition of the authors towards algorithms that mix infogainempowerment objectives with curiosity objectives typos across a wide spectrum of agent behavior behaviors well known rl agents well known rl algorithms we first collected datasets of a variety of agent behavior on which to compute and evaluate our metrics this sounds weird to me after collecting learning trajectories for various rl agents we can compute behavioral metrics the total information gain of over agents lifetime gain computed over the agents lifetime may key to exploration may be key to a good exploration in 17 out 18 cases in 17 out of 18 ### Summary:
the reviewers agree that the paper in its current form is not strong enough to allow for publication there are specific weaknesses that need to be tackled namely a better correlation study a clearer relationship to existing literature improvement on the novelty and a clearer more precise use of descriptions the authors are encouraged to continue with their work and submit a more mature manuscript
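The statistical procedure the fourth review asks for (report the correlation coefficient together with its p-value, then shrink the per-test confidence level when several correlations are tested at once) is easy to make concrete. The sketch below is illustrative only, assuming Spearman correlation via scipy and a Bonferroni-style familywise correction; the function name and data layout are not taken from the paper under review.

```python
from scipy import stats

def correlations_with_fwer(metric_pairs, alpha=0.05):
    """Report correlation, p-value and FWER-corrected significance.

    metric_pairs: dict mapping a test name to a pair of equal-length
    sequences, e.g. {"curiosity_vs_human_sim": (curiosity, human_sim)}.
    Illustrative sketch only; not code from the reviewed paper.
    """
    n_tests = len(metric_pairs)
    corrected_alpha = alpha / n_tests  # Bonferroni: keeps the familywise error rate at alpha
    report = {}
    for name, (x, y) in metric_pairs.items():
        rho, p = stats.spearmanr(x, y)  # stats.pearsonr or stats.kendalltau work the same way
        report[name] = {"rho": rho, "p": p, "significant": p < corrected_alpha}
    return report
```

With the ten correlations per graph mentioned in the review, the per-test threshold becomes 0.005 for an overall 5% confidence level.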
[input_ids: tokenized form of the example above; long integer array not reproduced here]
[attention_mask: all ones, same length as input_ids]
[labels: identical to input_ids for this row]
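The three integer blocks above are the tokenized columns for this row: the token ids, an attention mask of all ones, and labels that simply repeat the ids. A minimal sketch of how such columns are commonly built is given below; the tokenizer name, prompt layout, and length cap are placeholder assumptions, since this excerpt does not say which vocabulary produced the ids.

```python
from transformers import AutoTokenizer

# Placeholder tokenizer; the vocabulary actually used for the ids above is not specified.
tok = AutoTokenizer.from_pretrained("gpt2")

def build_row(review_text: str, summary_text: str, max_len: int = 2048):
    # Review and target summary concatenated the way causal-LM fine-tuning sets often are.
    text = review_text + "\n### Summary:\n" + summary_text
    enc = tok(text, truncation=True, max_length=max_len)
    return {
        "input_ids": enc["input_ids"],
        # no padding yet, so the mask is all ones, matching the row above
        "attention_mask": enc["attention_mask"],
        # labels copy input_ids, as in the row above
        "labels": list(enc["input_ids"]),
    }
```

Duplicating input_ids into labels is the usual setup for causal-language-model fine-tuning, where the loss is computed over the whole concatenated sequence.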
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this is an interesting paper that proposes abstractions based on return distribution similarities to be used as auxiliary tasks to aid in learning the idea is quite promising and i think could open up new avenues for future research but it does not appear to me to be ready for publication yet in particular although the authors claim to be distinguishing based on returns many of the design decisions seem to implicitly assume determinism or neardeterminism in particular see point 3 and 10 below the theoretical results seem interesting but i have some questions on their validity and clarity see point 7 and 8 below the practical algorithm has a few issues that need clarification see points 3 4 9 11 and 12 below 1 at the bottom of page 2 the authors write we are the first to leverage return to construct a contrastive auxiliary task for speeding up the main rl task this is not quite true see zhang a mcallister r calandra r gal y and levine s 2020 learning invariant representations for reinforcement learning without reconstruction arxiv preprint arxiv200610742 2 in section 22 you should cite taylor j precup d and panagaden p 2009 bounding performance loss in approximate mdp homomorphisms in advances in neural information processing systems 16491656 3 in line 6 of algorithm 1 y mathbbibr1 ne br2 seems problematic for stochastic returns in particular if the number of bins is very large and the returns have wide variance y will almost always be zero 4 in line 8 of algorithm 1 is hatw also learned 5 in the last sentence of the first paragraph of section 41 rather than comparing against regular bisimulation it seems more appropriate to compare to pibisimulation from castro p s 2020 scalable methods for computing state similarity in deterministic markov decision processes in proceedings of the aaai conference on artificial intelligence 6 in equation 1 the minimization appears over f but f doesnt appear on the rhs shouldnt the minimization be over w 7 in theorem 41 its not clear what role x1 plays in the result why is this third state necessary it does not seem to show up in the proof in the appendix further the proof in the appendix could do with some elaboration as its not completely clear how lemmas b1 and b2 result in the proof of theorem 41 it would be better if the authors restate the theorem statement in the appendix and were more explicit about the connections there are no page limits for the appendix 8 how do we get a sense for how big phin is couldnt it be as large as nmathcalx 9 typically auxiliary losses are combined with the main loss into a single loss but in algorithm 2 you seem to be updating them sequentially why does the order of update matter 10 in figure 1 on the left the purple rectangle says value net if so are you really learning a distribution 11 in figure 2 why do some algorithms only have triangles and not learning curves also it seems from that figure that state sac skyline seems to outperform all others including rcrl 12 is figure 3 over a single seed 13 in definition a1 bisimulation was not introduced in jiang 2018 it was introduced in givan r dean t greig m 2003 equivalence notions and model minimization in markov decision processes artificial intelligence 147 163223 minor comments 1 it would be helpful if the authors specify how to pronounce rcrl while reading the paper i was pronouncing it like thishttpsyoutubedqw4w9wgxcq 2 in the secondtolast sentence on page 
1 should read while ignoring returnirrelevant featuresdocsep summary the authors present a contrastive auxiliary loss based upon stateaction returns they introduce an abstraction over stateaction pairs and divide the space of stateaction returns into k bins over which the z function is defined where zsa is distributed over kdimensional vectors given an encoding function phi and an input x zirrelevance is defined as phix1 phix2 when zx1 zx2 which motivates the objective for zlearning to classify stateaction pairs with similar returns within bounds to be similar from this a contrastive loss can be defined returnbased contrastive rl rcrl where class labels are determined by zirrelevance sets encouraging stateaction encodings to be similar when the returns are in the limit z becomes the rl stateaction value function q the authors evaluate their approach on atari discrete actions and the deepmind control suite continuous actions across both modelfree and model based rl algorithms against and in combination with other auxilliary losses including curl srinivas et al 2020 strengths weaknesses auxiliary losses have become an important component in rl for developing stable agents that can generalize well and form good representations in particular contrastive losses have come into increasing use with growing literature around these methods and so i believe the domain area of this paper is relevant and of interest the authors do a good job of covering the recent developments of background literature in their related work section and grounding their approach with recent efforts undertaken in rl auxiliary losses contrastive learning approaches and state abstractionrepresentation learning literature the approach is overall novel as many contrastive learning methods are defined against input data or downstream representations whereas this work derives its data from rl returns and creates a link between the representational landscape of the observations and actions and broad outcomes as they are valuable to an agent as the authors have framed the problem i believe this approach is more powerful and also more tractable than something like reward prediction intuitively the formulation seems solid to me since we often would like to understand not only when were in a good state and taking a useful action but also in general what kind of properties stateaction pairs with similar returns should have the authors do note that this may be learnable by temporal difference updates alone however this approach aims to directly encourage the learning of this relationship and decouple it from the rl algorithm where perhaps other things may be focused on such as planning etc one shortfall of this approach could be the available data itself as youd rely on the policy to provide you with good samples for rcrl the authors indicate that they segment trajectories to ensure better quality positive and negative samples for learning however it could be made clearer how much of a problem this can be this approach could possibly be combined with a selfsupervised approach to alleviate these types of concerns it would also be nice to know the additional computational burden of rcrl and how this compares to other auxiliary losses the experiments on atari control look solid and demonstrate that this method attained a stronger score both alone and when combined with curl and good top performance on deepmind control suite tasks when compared again to curl and pixelsac it might have been nice to see more comparisons or combinations with 
other contrastive methods that have had some success in learning visual representations simclr chen et al 2020 byol grill et al 2020 the similarity analysis also provided some nice insight into the inductive bias induced by rcrl overall the paper is well written and has a clear layout the authors provide clear algorithms and figures and the content flows well from section to section recommendation i believe that this is a promising and very active area of research and that this work makes the case for a solid new approach and a set of encouraging results to back it up docsep returnbased contrastive representation learning for reinforcement learning summary the authors propose returnbased contrastive representation learning rcrl a contrastive auxiliary learning task that guides the feature network to encode representation relevant to the task rewards the experiment results show that rcrl helps improve two commonly used rl algorithm rainbow and sac in low data regime additionally rcrl can also be used in combination with other auxiliary tasks to boost performance overall the paper is wellwritten the topic is relevant to the field and the approach is novel strength theoreticallybacked the topic of representation learning is pretty relevant to the field now the learned representation is taskrelevant and therefore can achieve higher performance compared to other representation learning methods weakness the reliance on environment returns make the approach pretty susceptible to poorly or sparsely defined rewards specifically 1 the auxiliary loss does not work in sparse reward environments 2 in the task with dense but deceptive rewards the representation may be biased toward representation that is not helpful in the long run the improvement in continuous control tasks seem to be really marginal why is that the learned representation may not be very general due to its reliance on return signals certainly it can help achieve better performance when we only focus on a single task with a welldefined reward function yet the representation may not be as useful when we considered some practical realworld settings that require policy adaptation and transfer learningdocsepsummary the authors propose the inclusion of an auxiliary task for training an rl model where the auxiliary task objective is to learn an abstraction of the stateaction space that clusters sa pairs according to their expected return the authors first describe a basic abstraction learning framework zlearning followed by the extension to deep rl as an auxiliary task rcrl the authors present results in atari discrete action building on rainbow showing an improvement compared to baselines on median hns in the lowdata regime and results on dmcontrol continuous action building on sac showing similar or improved performance compared to baselines quality overall i found the approach and results to be interesting and moderately compelling at first glance the improvement is surprising given that modelfree deep rl already needs to abstract the state space on the basis of returns even without an auxiliary task the key appears to be the focus on sample efficiency in the lowdata regime where the task seems to improve nonlocal value signal propagation compared to a bootstrapped algorithm particularly on atari note that in the 100k regime the modelfree algorithms have not yet learned to play pong since it is not clear that the algorithm will generalize to more data its easy to imagine that the abstraction task will hinder performance when the base 
algorithms become more finely tuned i would like to see more clarification of the goal throughout the paper eg in the lowdata regime our algorithm outperforms strong baselines on complex tasks in the deepmind control suite and atari games in the abstract as well as a reference to the focus in the conclusions in the lowdata regime its also critical to justify this approach compared to a modelbased alternative on atari the authors compare to simple but muzero would be a stronger baseline besides the empirical results the authors also nicely provide a description of the zpi abstraction and an error bound clarity i was confused by the description of the positivenegative sampling procedure in 43 paragraph 2 are segments temporally consecutive within a trajectory if so is it primarily a heuristic that they will contain stateaction pairs with similar returns ie couldnt a reward achieved midsegment make this statement incorrect as i understand it segmenting avoids the problem of determining bins on the return distribution apriori however it also seems like it will limit the agents ability to cluster nonlocal sa pairs with the same returns it might also mean that the agent is learning to cluster temporally adjacent states in the underlying state space rather than similar returns originality the paper builds on existing work in the abstraction literature and auxiliary tasks for deep rl the primary novel component is using a returnbased auxiliary task the zpi abstraction framework also appears to be novel although its closely related to existing abstractions like qpi abstraction significance the rcrl model itself improves on existing modelfree approaches and can be easily incorporated into many modelfree architectures although it seems unlikely to beat a strong modelbased baseline like muzero in the lowdata regime the description of the zpi abstraction and the exploration of returnbased auxiliary tasks in general could prove more significant in the long term pros the model improves performance in the lowdata regime over existing modelfree baselines the model can be easily added to many existing architectures description and theoretical results on a new type of abstraction cons the paper needs some more clarity around the focus on lowdata sample efficiency and how applicable the model is to higher data regimes unclear if the segmentbased sampling strategy is clustering sa pairs with similar returns or just states that are nearby in the underlying state space the model seems unlikely to improve on a stronger modelbased baseline in the lowdata regime ### Summary:
rcrl is returnbased contrastive learning for reinforcement learning where the label is whether two samples belong to the same return bin the reviewers found this to be a well executed paper with good theoretical and experimental results
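The labelling rule quoted from the paper's algorithm 1 in the first review above (y indicates whether two sampled returns fall in different bins, learned through an encoder phi and a classifier hat-w) can be sketched compactly. The code below is a reconstruction from the review text, assuming placeholder layer sizes, pairing strategy, and loss form rather than the authors' implementation.

```python
import torch
import torch.nn as nn

class ReturnContrastiveHead(nn.Module):
    """Sketch of a return-bin auxiliary loss as described in the reviews above."""

    def __init__(self, encoder: nn.Module, emb_dim: int):
        super().__init__()
        self.encoder = encoder                    # phi(s, a) -> embedding
        self.classifier = nn.Sequential(          # plays the role of hat-w
            nn.Linear(2 * emb_dim, 128), nn.ReLU(), nn.Linear(128, 1))

    def loss(self, sa1, sa2, ret1, ret2, bin_edges):
        # Label a pair by whether its two returns land in different bins, as in
        # the quoted line 6 of algorithm 1; with a very large number of bins and
        # stochastic returns the labels collapse toward one class, which is the
        # degeneracy the first reviewer raises.
        y = (torch.bucketize(ret1, bin_edges) != torch.bucketize(ret2, bin_edges)).float()
        z = torch.cat([self.encoder(sa1), self.encoder(sa2)], dim=-1)
        logits = self.classifier(z).squeeze(-1)
        return nn.functional.binary_cross_entropy_with_logits(logits, y)
```

In practice phi would be the agent's feature network, and the auxiliary loss would be added to or alternated with the main RL loss, which is exactly the ordering question raised in point 9 of that review.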
[input_ids: tokenized form of the example above; array truncated]
7533, 326, 2430, 3646, 15644, 285, 3700, 4715, 7152, 339, 793, 360, 3454, 50276, 783, 4477, 12661, 253, 11250, 273, 271, 24026, 4836, 323, 3733, 271, 391, 77, 1566, 835, 253, 24026, 4836, 8103, 310, 281, 3037, 271, 38562, 273, 253, 1375, 1913, 2317, 326, 9959, 618, 8557, 2556, 281, 616, 3264, 1091, 50276, 783, 4477, 806, 6266, 247, 5044, 38562, 4715, 7792, 1182, 28269, 3560, 407, 253, 6880, 281, 3676, 391, 77, 347, 271, 24026, 4836, 391, 7083, 77, 253, 4477, 1246, 1543, 275, 387, 1792, 13358, 2250, 3652, 327, 37422, 4645, 271, 7756, 2429, 281, 1666, 25379, 327, 8876, 288, 2224, 275, 253, 1698, 2203, 9459, 285, 1543, 327, 42961, 8519, 5415, 2250, 3652, 327, 7044, 4645, 2074, 390, 5520, 3045, 2429, 281, 1666, 25379, 50276, 15177, 50276, 1189, 455, 891, 1119, 253, 2746, 285, 1543, 281, 320, 4722, 285, 28249, 18511, 387, 806, 17834, 253, 7756, 310, 10084, 1677, 326, 771, 813, 658, 3676, 391, 77, 2168, 3198, 281, 12002, 253, 1375, 2317, 327, 253, 3720, 273, 6548, 1014, 1293, 271, 24026, 4836, 253, 2234, 4620, 281, 320, 253, 2770, 327, 3410, 6733, 275, 253, 1698, 2203, 9459, 835, 253, 4836, 3133, 281, 3157, 1327, 6790, 1318, 2625, 18634, 2429, 281, 247, 7491, 10981, 1882, 5933, 3782, 327, 387, 1792, 3877, 326, 275, 253, 2233, 76, 9459, 253, 771, 813, 658, 11333, 452, 417, 2568, 6311, 281, 1132, 268, 543, 1580, 352, 310, 417, 2590, 326, 253, 5933, 588, 39970, 281, 625, 941, 697, 3477, 281, 8564, 326, 253, 38562, 4836, 588, 35007, 3045, 672, 253, 2613, 11333, 2489, 625, 25806, 24251, 891, 651, 751, 281, 923, 625, 37699, 273, 253, 4736, 4768, 253, 2929, 24088, 275, 253, 1698, 2203, 9459, 776, 5933, 41731, 13015, 2266, 1666, 25379, 327, 2570, 8892, 275, 253, 3676, 14785, 1453, 18880, 285, 387, 1792, 3958, 275, 253, 12002, 347, 973, 347, 247, 3806, 281, 253, 2770, 275, 253, 11815, 50276, 249, 253, 1698, 2203, 9459, 697, 671, 4619, 281, 15249, 436, 2746, 2429, 281, 247, 1566, 3169, 5795, 327, 387, 1792, 253, 4477, 7277, 281, 2969, 533, 278, 7958, 2771, 651, 320, 247, 10046, 8245, 50276, 67, 11587, 253, 16774, 1543, 253, 4477, 671, 23395, 2085, 247, 5740, 273, 253, 1182, 2059, 38562, 285, 271, 2228, 3033, 50276, 498, 15752, 50276, 74, 369, 13477, 407, 253, 5740, 273, 253, 10538, 3870, 909, 800, 10491, 5199, 275, 7652, 12494, 374, 403, 13288, 5897, 595, 12640, 1561, 247, 18974, 604, 594, 310, 352, 8558, 247, 47641, 326, 597, 588, 3831, 1375, 1913, 8557, 342, 2074, 6548, 26332, 812, 2649, 247, 10921, 6786, 4260, 29429, 1056, 436, 3908, 13583, 347, 891, 2096, 352, 8223, 272, 32547, 253, 1895, 273, 8925, 27925, 327, 253, 1091, 3268, 1049, 7947, 74, 2299, 352, 671, 3133, 751, 352, 588, 2701, 253, 6083, 3745, 281, 7368, 1327, 6790, 618, 8557, 342, 253, 1072, 6548, 352, 1537, 671, 1599, 326, 253, 5570, 310, 4715, 281, 7368, 5897, 595, 9701, 3054, 275, 253, 6944, 1375, 2317, 2581, 685, 2074, 6548, 50276, 19164, 414, 50276, 783, 2929, 21168, 327, 5368, 789, 275, 253, 38562, 6239, 285, 24026, 8892, 323, 3676, 391, 77, 253, 3625, 4460, 4445, 310, 970, 247, 1091, 3169, 24026, 4836, 50275, 783, 1182, 2059, 38562, 7792, 671, 4620, 281, 320, 4460, 3738, 697, 8244, 2905, 281, 5368, 490, 10981, 960, 751, 2805, 2059, 38562, 50276, 9188, 40348, 50276, 783, 391, 7083, 77, 1566, 3139, 19132, 327, 5368, 771, 813, 658, 7274, 285, 476, 320, 4354, 11217, 715, 1142, 771, 813, 658, 35615, 3738, 352, 3133, 11543, 281, 7171, 247, 2266, 1566, 3169, 8245, 751, 278, 7958, 2771, 275, 253, 1698, 2203, 9459, 50275, 783, 5740, 273, 253, 1182, 2059, 38562, 285, 253, 17947, 273, 1091, 3169, 24026, 8892, 275, 2087, 812, 5276, 625, 1534, 
275, 253, 1048, 1307, 50276, 856, 84, 50276, 783, 1566, 19132, 3045, 275, 253, 1698, 2203, 9459, 689, 5368, 771, 813, 658, 1666, 25379, 50276, 783, 1566, 476, 320, 4354, 2879, 281, 1142, 5368, 35615, 50276, 10008, 285, 10527, 1543, 327, 247, 747, 1511, 273, 38562, 50276, 5040, 50276, 783, 2929, 3198, 690, 625, 19843, 1475, 253, 2770, 327, 1698, 2203, 50276, 16848, 6733, 285, 849, 7763, 253, 1566, 310, 281, 2169, 941, 27005, 50276, 328, 8250, 604, 253, 8223, 3169, 10491, 5700, 310, 17524, 618, 8557, 342, 2074, 6548, 390, 816, 3054, 326, 403, 10151, 275, 253, 6944, 1375, 2317, 50276, 783, 1566, 3133, 11543, 281, 3157, 327, 247, 10046, 1566, 3169, 8245, 275, 253, 1698, 2203, 9459, 2490, 187, 4118, 18435, 27, 3373, 8435, 310, 1091, 3169, 4499, 422, 4715, 323, 35221, 4715, 835, 253, 5203, 310, 1880, 767, 3530, 5663, 281, 253, 1072, 1091, 10269, 253, 30628, 1119, 436, 281, 320, 247, 973, 11407, 2929, 342, 1175, 10527, 285, 5661, 1543 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 323, 849, 1943, 815, 249, 310, 812, 2649, 352, 320, 347, 1781, 347, 295, 1588, 89, 898, 5431, 24026, 11655, 403, 5678, 342, 253, 2022, 2957, 715, 247, 2014, 2957, 533, 275, 5933, 374, 368, 1646, 281, 320, 22753, 731, 32627, 2139, 1057, 253, 1340, 273, 5731, 2647, 884, 275, 4677, 337, 327, 253, 1669, 253, 19445, 25334, 2296, 1318, 2036, 604, 594, 403, 368, 1663, 4715, 247, 3268, 1903, 275, 4677, 374, 2139, 513, 690, 11333, 760, 452, 30102, 285, 417, 4715, 9191, 671, 352, 3133, 432, 326, 4677, 326, 1375, 7044, 1629, 36226, 3133, 281, 562, 32231, 512, 2571, 1690, 391, 7083, 77, 1249, 310, 4677, 495, 689, 247, 2014, 8357, 2145, 275, 5426, 247, 18, 17542, 303, 1427, 369, 417, 5611, 275, 480, 22589, 4765, 352, 369, 5611, 275, 1259, 266, 391, 372, 266, 246, 50276, 24204, 304, 278, 6469, 19945, 27367, 285, 1566, 41458, 275, 1616, 729, 3061, 4870, 13345, 9260, 20825, 1668, 1237, 1508, 50275, 37585, 5701, 337, 352, 651, 320, 9371, 604, 253, 4477, 13199, 849, 281, 39640, 391, 7083, 77, 1223, 4361, 253, 2929, 891, 369, 11093, 28590, 352, 751, 436, 3614, 90, 483, 538, 264, 82, 88, 21, 88, 26, 46506, 10587, 82, 374, 275, 253, 1273, 34776, 505, 6197, 327, 3239, 337, 943, 1239, 1223, 23111, 1091, 343, 15477, 3386, 7152, 33032, 6010, 50276, 783, 4477, 1246, 247, 4499, 422, 24026, 2957, 1754, 2220, 1375, 1913, 6548, 50276, 9328, 9569, 271, 38562, 689, 1375, 1913, 8557, 285, 10957, 253, 2317, 273, 1375, 1913, 6548, 715, 465, 27925, 689, 534, 253, 1182, 1159, 310, 2931, 835, 1182, 6678, 310, 5939, 689, 465, 6967, 11390, 50276, 28821, 271, 9706, 1159, 815, 74, 285, 271, 3280, 1269, 1182, 343, 11235, 11828, 310, 2931, 347, 815, 895, 18, 50276, 545, 895, 19, 672, 1182, 89, 18, 50276, 91, 89, 19, 534, 15265, 684, 253, 8103, 323, 1182, 28269, 281, 30215, 1375, 1913, 8557, 342, 2074, 6548, 1561, 14493, 281, 320, 2074, 50276, 4064, 436, 247, 4499, 422, 2957, 476, 320, 2931, 1091, 3169, 4499, 422, 391, 77, 391, 7083, 77, 835, 966, 13301, 403, 3413, 407, 1182, 343, 11235, 11828, 5239, 18462, 1375, 1913, 2349, 351, 723, 281, 320, 2074, 672, 253, 6548, 403, 50276, 249, 253, 2701, 1182, 4916, 253, 391, 77, 1375, 1913, 1318, 1159, 2805, 50276, 783, 4477, 7472, 616, 2746, 327, 387, 1792, 13358, 5231, 285, 253, 3676, 14785, 1453, 18880, 5415, 5231, 2439, 1097, 771, 813, 658, 285, 1566, 1754, 391, 77, 11333, 1411, 285, 275, 5019, 342, 643, 10992, 3370, 552, 11655, 1690, 26721, 256, 11078, 34627, 1162, 355, 9169, 50275, 296, 3755, 20556, 50276, 20881, 1255, 265, 50276, 10422, 15434, 11655, 452, 2489, 271, 1774, 4445, 275, 391, 77, 323, 6684, 6474, 6083, 326, 476, 39970, 973, 285, 830, 1175, 14237, 50276, 249, 1798, 4499, 422, 11655, 452, 1705, 715, 3629, 897, 342, 5675, 6239, 1475, 841, 3082, 285, 594, 891, 2868, 253, 5028, 2170, 273, 436, 2929, 310, 4623, 285, 273, 1600, 50276, 783, 4477, 513, 247, 1175, 2628, 273, 10985, 253, 3332, 16936, 273, 4114, 6239, 275, 616, 2905, 789, 2593, 285, 3216, 272, 616, 2746, 342, 3332, 6031, 20023, 275, 391, 77, 24026, 11655, 4499, 422, 4715, 7274, 285, 1375, 38562, 37626, 4715, 6239, 50276, 783, 2746, 310, 4583, 4460, 347, 1142, 4499, 422, 4715, 3082, 403, 2931, 1411, 3280, 941, 390, 15450, 14237, 5727, 436, 789, 38422, 697, 941, 432, 391, 77, 6548, 285, 10513, 247, 3048, 875, 253, 1957, 1050, 13016, 273, 253, 7313, 285, 5231, 285, 3862, 6973, 347, 597, 403, 9865, 281, 271, 5570, 50276, 284, 253, 4477, 452, 29318, 253, 1895, 891, 2868, 436, 2746, 310, 625, 6422, 285, 671, 625, 10649, 494, 685, 1633, 751, 10921, 10554, 50276, 565, 41597, 253, 15895, 3133, 4891, 281, 479, 1580, 359, 2223, 
651, 751, 281, 2096, 417, 760, 672, 497, 275, 247, 1175, 1375, 285, 3192, 247, 4217, 2250, 533, 671, 275, 2087, 752, 2238, 273, 3607, 1375, 1913, 8557, 342, 2074, 6548, 943, 452, 50276, 783, 4477, 513, 3877, 326, 436, 778, 320, 3037, 494, 407, 11935, 3064, 11269, 3815, 2299, 436, 2746, 13698, 281, 3587, 11907, 253, 4715, 273, 436, 2954, 285, 34430, 713, 352, 432, 253, 391, 77, 5933, 835, 4931, 643, 1841, 778, 320, 7106, 327, 824, 347, 7219, 3966, 50275, 531, 2159, 12615, 273, 436, 2746, 812, 320, 253, 2130, 941, 3139, 347, 368, 69, 10725, 327, 253, 3646, 281, 2085, 368, 342, 1175, 3530, 323, 391, 7083, 77, 50276, 783, 4477, 5224, 326, 597, 8223, 24102, 281, 5416, 1805, 3290, 2762, 285, 4016, 3530, 323, 4715, 2299, 352, 812, 320, 1160, 30909, 849, 1199, 273, 247, 1895, 436, 476, 320, 50274, 2520, 2746, 812, 6830, 320, 5678, 342, 247, 1881, 35421, 2746, 281, 33623, 841, 3510, 273, 7350, 50275, 262, 651, 671, 320, 5322, 281, 871, 253, 3081, 15180, 7977, 273, 391, 7083, 77, 285, 849, 436, 26662, 281, 643, 24026, 11655, 50276, 783, 4679, 327, 387, 1792, 50276, 8519, 1007, 4891, 285, 7568, 326, 436, 1332, 26553, 247, 10046, 4868, 1097, 3815, 285, 672, 5678, 342, 26721, 285, 1175, 1755, 3045, 327, 3676, 14785, 1453, 18880, 8892, 672, 2429, 969, 281, 26721, 285, 15115, 317, 50276, 262, 1537, 452, 644, 5322, 281, 923, 625, 14023, 390, 13553, 342, 643, 4499, 422, 3082, 326, 452, 574, 690, 2323, 275, 50276, 28269, 5304, 14237, 948, 498, 83, 260, 864, 1162, 355, 9169, 407, 311, 32257, 1162, 355, 9169, 50276, 783, 14259, 1783, 671, 2530, 690, 5322, 12288, 715, 253, 42115, 8492, 5802, 407, 391, 7083, 77, 50276, 1189, 455, 253, 2929, 310, 973, 3542, 285, 556, 247, 2590, 12806, 50276, 783, 4477, 2085, 2590, 11333, 285, 8442, 285, 253, 2600, 14221, 973, 432, 2593, 281, 2593, 50275, 250, 27167, 318, 50276, 74, 2868, 326, 436, 310, 247, 12532, 285, 1077, 3939, 2170, 273, 2561, 285, 326, 436, 789, 2789, 253, 1083, 323, 247, 4891, 747, 2746, 285, 247, 873, 273, 18462, 1543, 281, 896, 352, 598, 5474, 33032, 1091, 3169, 4499, 422, 6779, 4715, 323, 35221, 4715, 50276, 8774, 253, 4477, 12661, 1091, 3169, 4499, 422, 6779, 4715, 391, 7083, 77, 247, 4499, 422, 24026, 4715, 4836, 326, 22591, 253, 4735, 2990, 281, 22573, 6779, 4623, 281, 253, 4836, 23267, 253, 3368, 1543, 921, 326, 391, 7083, 77, 7729, 3157, 767, 7744, 908, 391, 77, 5933, 37422, 285, 7044, 275, 1698, 941, 9459, 23000, 391, 7083, 77, 476, 671, 320, 908, 275, 5019, 342, 643, 24026, 8892, 281, 9510, 3045, 50276, 1189, 455, 253, 2929, 310, 973, 15720, 253, 9400, 310, 4623, 281, 253, 1673, 285, 253, 2746, 310, 4460, 50275, 45563, 50276, 783, 7262, 1037, 32797, 50276, 783, 9400, 273, 6779, 4715, 310, 3965, 4623, 281, 253, 1673, 1024, 50276, 783, 6311, 6779, 310, 4836, 15477, 285, 3103, 476, 5115, 2169, 3045, 2429, 281, 643, 6779, 4715, 3082, 50275, 20881, 1255, 50276, 783, 22095, 327, 3126, 6548, 1056, 253, 2746, 3965, 16931, 281, 15225, 390, 37139, 600, 2931, 23267, 5742, 50275, 18, 253, 24026, 2957, 1057, 417, 789, 275, 23507, 10921, 12620, 50275, 19, 275, 253, 4836, 342, 14086, 533, 50015, 23267, 253, 6779, 778, 320, 23539, 2584, 6779, 326, 310, 417, 9371, 275, 253, 1048, 1408, 50276, 783, 7756, 275, 5415, 1453, 8892, 1646, 281, 320, 1663, 16888, 2139, 310, 326, 50276, 783, 6311, 6779, 778, 417, 320, 1077, 2087, 1955, 281, 697, 22095, 327, 1091, 6298, 5604, 352, 476, 1361, 5115, 1805, 3045, 672, 359, 760, 2770, 327, 247, 2014, 4836, 342, 247, 6210, 392, 37224, 10921, 1159, 2568, 253, 6779, 778, 417, 320, 347, 4217, 672, 359, 2783, 690, 8542, 1524, 10186, 
7533, 326, 2430, 3646, 15644, 285, 3700, 4715, 7152, 339, 793, 360, 3454, 50276, 783, 4477, 12661, 253, 11250, 273, 271, 24026, 4836, 323, 3733, 271, 391, 77, 1566, 835, 253, 24026, 4836, 8103, 310, 281, 3037, 271, 38562, 273, 253, 1375, 1913, 2317, 326, 9959, 618, 8557, 2556, 281, 616, 3264, 1091, 50276, 783, 4477, 806, 6266, 247, 5044, 38562, 4715, 7792, 1182, 28269, 3560, 407, 253, 6880, 281, 3676, 391, 77, 347, 271, 24026, 4836, 391, 7083, 77, 253, 4477, 1246, 1543, 275, 387, 1792, 13358, 2250, 3652, 327, 37422, 4645, 271, 7756, 2429, 281, 1666, 25379, 327, 8876, 288, 2224, 275, 253, 1698, 2203, 9459, 285, 1543, 327, 42961, 8519, 5415, 2250, 3652, 327, 7044, 4645, 2074, 390, 5520, 3045, 2429, 281, 1666, 25379, 50276, 15177, 50276, 1189, 455, 891, 1119, 253, 2746, 285, 1543, 281, 320, 4722, 285, 28249, 18511, 387, 806, 17834, 253, 7756, 310, 10084, 1677, 326, 771, 813, 658, 3676, 391, 77, 2168, 3198, 281, 12002, 253, 1375, 2317, 327, 253, 3720, 273, 6548, 1014, 1293, 271, 24026, 4836, 253, 2234, 4620, 281, 320, 253, 2770, 327, 3410, 6733, 275, 253, 1698, 2203, 9459, 835, 253, 4836, 3133, 281, 3157, 1327, 6790, 1318, 2625, 18634, 2429, 281, 247, 7491, 10981, 1882, 5933, 3782, 327, 387, 1792, 3877, 326, 275, 253, 2233, 76, 9459, 253, 771, 813, 658, 11333, 452, 417, 2568, 6311, 281, 1132, 268, 543, 1580, 352, 310, 417, 2590, 326, 253, 5933, 588, 39970, 281, 625, 941, 697, 3477, 281, 8564, 326, 253, 38562, 4836, 588, 35007, 3045, 672, 253, 2613, 11333, 2489, 625, 25806, 24251, 891, 651, 751, 281, 923, 625, 37699, 273, 253, 4736, 4768, 253, 2929, 24088, 275, 253, 1698, 2203, 9459, 776, 5933, 41731, 13015, 2266, 1666, 25379, 327, 2570, 8892, 275, 253, 3676, 14785, 1453, 18880, 285, 387, 1792, 3958, 275, 253, 12002, 347, 973, 347, 247, 3806, 281, 253, 2770, 275, 253, 11815, 50276, 249, 253, 1698, 2203, 9459, 697, 671, 4619, 281, 15249, 436, 2746, 2429, 281, 247, 1566, 3169, 5795, 327, 387, 1792, 253, 4477, 7277, 281, 2969, 533, 278, 7958, 2771, 651, 320, 247, 10046, 8245, 50276, 67, 11587, 253, 16774, 1543, 253, 4477, 671, 23395, 2085, 247, 5740, 273, 253, 1182, 2059, 38562, 285, 271, 2228, 3033, 50276, 498, 15752, 50276, 74, 369, 13477, 407, 253, 5740, 273, 253, 10538, 3870, 909, 800, 10491, 5199, 275, 7652, 12494, 374, 403, 13288, 5897, 595, 12640, 1561, 247, 18974, 604, 594, 310, 352, 8558, 247, 47641, 326, 597, 588, 3831, 1375, 1913, 8557, 342, 2074, 6548, 26332, 812, 2649, 247, 10921, 6786, 4260, 29429, 1056, 436, 3908, 13583, 347, 891, 2096, 352, 8223, 272, 32547, 253, 1895, 273, 8925, 27925, 327, 253, 1091, 3268, 1049, 7947, 74, 2299, 352, 671, 3133, 751, 352, 588, 2701, 253, 6083, 3745, 281, 7368, 1327, 6790, 618, 8557, 342, 253, 1072, 6548, 352, 1537, 671, 1599, 326, 253, 5570, 310, 4715, 281, 7368, 5897, 595, 9701, 3054, 275, 253, 6944, 1375, 2317, 2581, 685, 2074, 6548, 50276, 19164, 414, 50276, 783, 2929, 21168, 327, 5368, 789, 275, 253, 38562, 6239, 285, 24026, 8892, 323, 3676, 391, 77, 253, 3625, 4460, 4445, 310, 970, 247, 1091, 3169, 24026, 4836, 50275, 783, 1182, 2059, 38562, 7792, 671, 4620, 281, 320, 4460, 3738, 697, 8244, 2905, 281, 5368, 490, 10981, 960, 751, 2805, 2059, 38562, 50276, 9188, 40348, 50276, 783, 391, 7083, 77, 1566, 3139, 19132, 327, 5368, 771, 813, 658, 7274, 285, 476, 320, 4354, 11217, 715, 1142, 771, 813, 658, 35615, 3738, 352, 3133, 11543, 281, 7171, 247, 2266, 1566, 3169, 8245, 751, 278, 7958, 2771, 275, 253, 1698, 2203, 9459, 50275, 783, 5740, 273, 253, 1182, 2059, 38562, 285, 253, 17947, 273, 1091, 3169, 24026, 8892, 275, 2087, 812, 5276, 625, 1534, 
275, 253, 1048, 1307, 50276, 856, 84, 50276, 783, 1566, 19132, 3045, 275, 253, 1698, 2203, 9459, 689, 5368, 771, 813, 658, 1666, 25379, 50276, 783, 1566, 476, 320, 4354, 2879, 281, 1142, 5368, 35615, 50276, 10008, 285, 10527, 1543, 327, 247, 747, 1511, 273, 38562, 50276, 5040, 50276, 783, 2929, 3198, 690, 625, 19843, 1475, 253, 2770, 327, 1698, 2203, 50276, 16848, 6733, 285, 849, 7763, 253, 1566, 310, 281, 2169, 941, 27005, 50276, 328, 8250, 604, 253, 8223, 3169, 10491, 5700, 310, 17524, 618, 8557, 342, 2074, 6548, 390, 816, 3054, 326, 403, 10151, 275, 253, 6944, 1375, 2317, 50276, 783, 1566, 3133, 11543, 281, 3157, 327, 247, 10046, 1566, 3169, 8245, 275, 253, 1698, 2203, 9459, 2490, 187, 4118, 18435, 27, 3373, 8435, 310, 1091, 3169, 4499, 422, 4715, 323, 35221, 4715, 835, 253, 5203, 310, 1880, 767, 3530, 5663, 281, 253, 1072, 1091, 10269, 253, 30628, 1119, 436, 281, 320, 247, 973, 11407, 2929, 342, 1175, 10527, 285, 5661, 1543 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper studies the power of abstention in robust classification a classifier that has the power of abstention can refuse to answer the query because it is unsure of the answer for robust classification the abstention power enables the classifier to refuse adversarial queries if the query is detected as an adversarial example the paper first shows a negative result on the possibility of robust classification after reading the statement of theorem and the proof multiple times it is not clear what the theorem is stating the theorem only says that the adversary can flip the label using arbitrary large perturbations which seems to be a trivial statement in the comments bellow i have listed several questions to get a clear understanding of this theorem after the negative result the paper studies the effect of abstention by showing a positive result on the 1nearest neighbor classifier the idea is simple whenever the query is far from its nearest neighbor in the training set the classifier refuses to answer this clearly provides some lower bound on the robustness as long as the data is well separated this result cannot be used for actual image datasets that are used in practice because 1nn will definitely not have good accuracy even without abstention also the images are not wellseparated at all however the authors still run some experiments by considering the adversarial attacks in the feature space the noise is added in the feature space instead of input space they show that using some good feature representation they can get acceptable robust accuracy using their method however it is important to note that this will not have any meaningful effect on the real datasets classification tasks as for evaluation i find the idea of paper in studying the provable effect of abstention exciting however the theorems that are proved lack clarity and significance i suggest the authors to rewrite the theorem in a more understandable way with all parameters clearly explained from a technical point of view it seems that the theorems that are proved are not really relevant to adversarial perturbations the definitions of adversarial perturbations seem arbitrary and not aligned with standard definitions commentsquestions to authors theorem 41 1 the current statement of theorem is trivial you dont provide any bound on the size of perturbation which makes the theorem not very useful 2the statement of theorem says a random vector v what does that mean are you considering robustness to random noise or adversarial noise if so how is this related to adversarial examples 3in the proof it says rfracrdelta sqrtn2delta is large enough to provide some property about the balls of size rdelta doesnt this statement require some distributional assumptions for data distribution 4 there are already some negative results about adversarial robustness that the paper could refer to theorem 51 1the definition of epsilonxadv is not clear at all which makes the whole theorem not really understandable 2it sounds to me that the bound on the error could be well beyond 1 if not you should explain why what if n3n2 3 what does this sentence mean the adversary is allowed to corrupt fx with arbitrarily large perturbation in a uniformdistributed subspace s of dimension n3 again what do you mean by uniform distributed 4in general defining adversarial perturbation in a subspace with smaller dimension seems not standard did you 
choose this type of perturbation for a specific reason or just for the sake of proving the theorem 5 the proof of this theorem does not sound to be rigorous are you assuming that without any perturbation the 1nn classifier is 100 accuratedocsepsummary this paper studies through a provable approach whether abstaining ie refusing to answer can be beneficial for achieving small adversarialrobust error in settings where the input is potentially adversarially perturbed the paper proves a separation between the power of models with and without abstain in particular it is shown that for a certain adversarial model more about this below when we force the model to answer without an abstain option it will have high adversarial error but when abstain is allowed it can have small adversarial error as well as small abstention rate in certain settings the paper then studies algorithms for robust contrastive learning in which they map the inputs into highdimensional spaces and then aim to classify them using an abstainenabled model based on 1nn the paper studies ways to adjust the parameters of the model as the data comes in an online fashion divided into batches they show how to achieve sublinear regret in such settings they then compare linear classifiers with their own 1nn style classifiers and show advantages in robustness with such models when abstaining is allowed pros not many previous works have studied the role of abstention in adversarial attacks aka adversarial examples evasion attacks this work is the first to aim for a provable separation this is a very natural and potentially impactful direction the ideas for the algorithm design through a data driven approach could lead to useful methods that have practical values cons i think the theoretical separation is not that meaningful due to two issues 1 the robustness is defined for an unnatural perturbation model it is a mixture of random and adversarial ie the perturbations are allowed to be in a randomly selected subspace but that is not the main issue the main issue is that the amount of perturbation in the subspace is unbounded this means the adversary can basically perturb the point to an arbitrarily far point where the ground truth also changes therefore it is not cleat at all if the perturbed point would indeed be misclassified or not which seems to be the minimum requirement to call something adversarial example here i want to contrast the noise model with say ellpbased noise model that is extensively studied in the literature to clarify the issue the idea there is that eg in the case of images bounded ellp perturbation preserves the ground truth in that case humans judgement so an attack that finds images with small ellp distance with a different classified label would be misclassified here nothing like that could be said as perturbations are arbitrarily long 2 to see a different but related issue with the definition used for robust error assume a function f completely learns the concept correctly and have zero error then on the one hand such model should not be able to have an adversarial example because any perturbation would be correctly answered ie imagine a change in a cat image to modify it into a dog picture and when the model says it is a dog we count this as error however the definition used in this paper would still allow to prove unconditional adversarial error for the model note that previous works eg the cited work of madry et al18 are sometimes implicitly defined for a setting that the perturbation cannot change the ground 
truth eg bounded perturbation of images do not change human judgement so if the label changes it would be misclassified but here the noise allows arbitrarily far perturbations it seems the experiments compare the new method with possible abstention with a linear classifier that is not designed to be robust i think a fairer way to show the advantage of abstain is to show that your method with abstain can beat another previous method that was designed to be robust eg using traditional adversarial training that would show a real jump in what we can do with abstention due to the above reasons i think the theoretical and the experimental contributions can be interpreted in a limited way and hence i am more inclined to recommend rejection main comments in algorithm 1 line 2 do you do this in some order eg if two points are at distance less than sigma you remove one of them or both of them discussions after theorem 51 somehow interpret it as showing some form of inherent tradeoff between success probability and abstention rate on normal not adversarial inputs but that does not seem to be necessarily the case for example going back to the case of images note that the input distribution eg images in cifar10 keep their concept label even after perturbation eg human judgement now one can either ask a robust model to output a label even when images are perturbed or be allowed to abstain when a perturbed image is given in the latter case a model can actually have 100 accuracy on the normal inputs while it might have a lot of abstain on adversarially perturbed points the disparity between my example and the message of theorem 51 seems to be either stemming from the fact that you allow arbitrarily long perturbations that will eventually change the label or that 1nn based approaches are not sufficiently powerful here assumption 1 page 5 we assume that at least 1 delta fraction of mass of the marginal distribution dfxy over is this for every y also can you discuss whether assumption 1 typically correct on real data in your experiments reported in table 1 page 8 how much the numbers change if you aim to get an adversarially perturbed point misclassified by further restricting what constitutes as a legitimate adversarial example my objection above to the theoretical formulation and proofs does not prevent you from potentially showing a separation in these experiments by really forcing the adversarial examples to be misclassified is your approximate adversary provably approximating the robust error further comments and typos the label y appears twice in the proof of page 4 as nonmath missing dnat does not seem to be the best choice to represent the abstention rate on normal data at least it is hard to guess it based on the notation post rebuttal comment thanks for sharing the response unfortunately the very basic issue with the definition used in this paper and its implications to practice remains unsolved to clarify the definition you just need to focus on the following simple example what if the model has zero riskerror if you perturb a point it would still be correctly classified yet they still show that adversarial examples are inevitable even in this setting this already shows something is fundamentally wrong with the definition used your response is that the attackeradversary will not get to change the label but only the features but please note the adversary is not allowed to choose the label the adversary picks the features and it is up to the model to correctly classify it or not if an attacker 
changes the picture of a cat to to a picture of a dog the neural net or any other model should call this dog and if does still calls it cat it would be a a mistake not the other way around the ground truth ie the concept function determines what is correct and what is not the above issue is not imaginary it has real affect on the experiments and as i said it is important to report in the experiments whether or not they attacks lead to actual misclassification i hope these comments will help improving the paper since as i said the topic of this paper is a very important one and so exactly because of this it is important to have the basics rightdocsep this paper proves some fundamental facts about classifiers that cant abstain provide a nonclassification and their robustness to adversarial perturbations in sec 4 they provide a result that such classifiers are always vulnerable to adversarial perturbations in a technical sense in particular there will always be a class in which most training examples can be randomly perturbed in a way that an incorrect label will result nearly half the time in sec 5 they propose a modified nearestneighbor classification algorithm with two parameters that control abstention and noise removal they provide upper bounds on error in a random subspace attack scheme and refineloosen these results in several more specificgeneral scenarios in secs 6 7 they discuss methods to tune the two parameters and provide experimental evidence of their theoretical results strengths weaknesses first the strengths i found the paper to be wellorganized and mostly wellwritten it aims to tackle a pretty fundamental problem and provides some clear and simple results in relation to a simple algorithm for the statements i checked the paper was technically sound as for the weaknesses i found some of the mathematical exposition to be hard to follow the proofs and sketches could really benefit from some diagrams and perhaps some simple example scenarios additionally i found myself unsure of the practical gains from their algorithm they make a single comparison to a linear classifier and its almost certain that the comparison would not be as favorable against any method that allows for classification in a nonbinary fashion with some level of confidence and thus some natural level of abstention recommendation based on the above i gave a rating of 7 i did not give a higher score as i am not sure of the practical relevance of the suggested algorithm on the other hand the theoretical results seem solid and are of a pretty fundamental nature i do provide this score with a grain of salt as i was not able to check all the theoretical statements which are the backbone of this paper in my opinion update i am downgrading my score to a 6 based on the opinions of my fellow reviewers it seems that perhaps the theoretical results are based on scenarios that are too simplistic for the community at hand moreover there are clearly some readability issues based upon the reactions of the other reviewers clarification questions suggestions 1 as suggested perhaps it would be helpful to discuss or consider existing methods that provide classification with some level of confidence it seems that these automatically provide some notion of abstention when confidence is not high enough i found it surprising that none of these were mentioned 2 i noticed that the supplementary material contains a few references on examples of the subspace threat model i feel like a sentence in the main text would be helpful to provide 
some context for the relevance of this threat model 3 in the proof of thm 41 the existence of a label y with volume fraction less than 05 exists by virtue of having at least two labels even with an abstention option existence would hold i believedocsepsummary the authors propose a connection between abstention and robustness to adversarial examples specifically the authors contend that without the ability to abstain any classifier can be fooled by an adversarial perturbation in the feature space they additionally provide results and experiments concerning the proper selection of a hyperparameter that tunes the abstention pros the authors include many theoretical analyses a good effort is made to address the problem from several relevant angles cons the paper is very difficult to read the introduction does not provide a clean line of thought motivating the paper i am not aware that the attack model considered that of an adversary allowed to make arbitrarily large moves in a subset of feature space with no constraints on the input space exists elsewhere in the literature the work cited brown et al 2018 requires that unrestricted adversarial examples remain unambiguous to human judges the abstract cites results for any classifier when in fact the result seems to be for a nearestneighbor style classifier which is unusual given that the setting is deep networks the variables used throughout are difficult to keep track of major issues theorem 41 this is obviously not true for all classifiers the proof is very difficult to follow and there are no useful details given in figure 1 the result seems like it may be true by virtue of the fact that knn with k1 defines a hyperplane between any two points then a randomly chosen vector with probability 12 epsilon inevitably crosses such a hyperplane eventually but theres no reason to believe such a value in feature space could be reverse engineered or even lies within the range of f the testing in section 71 does not seem to include the use of nonadversarial test samples in evaluating whether or not a threshold is too strict to be of use it would be necessary to evaluate on indomain test samples as well as the training set these results seem likely to be overfit to the training data the testing setup is unclear perhaps there are further details in the referenced papers but it is not even clear to me how many classes are in the set is it only two given the baseline of a linear classifier the clarity of the paper is severely lacking it is difficult to follow the contribution of many of the theorems especially concerning their generality or lack thereof given that i find these issues too concerning to recommend publication i have not carefully checked all of the proofs ### Summary:
the paper considers the problem of abstention in robust classification a number of issues were identified in the formal framework and the writing was also not up to scratch the authors should take into account the many constructive suggestions made by the reviewers in preparing a revision
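The reviews above repeatedly describe the paper's defense as a nearest-neighbor classifier with two knobs: a noise-removal radius used to clean the training set and an abstention radius beyond which queries are refused. The sketch below is only an illustration of that mechanism, not the paper's actual algorithm; the parameter names (sigma for noise removal, tau for abstention) and the exact rule for discarding ambiguous training points are assumptions made for the example.

```python
import numpy as np

class AbstainingNearestNeighbor:
    """1-NN classifier that can refuse to answer (illustrative sketch only)."""
    ABSTAIN = -1  # sentinel returned when the classifier refuses to predict

    def __init__(self, sigma=0.5, tau=1.0):
        self.sigma = sigma  # assumed noise-removal radius for cleaning the training set
        self.tau = tau      # assumed abstention radius for rejecting far-away queries

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        keep = np.ones(len(X), dtype=bool)
        # Drop training points that sit within sigma of a differently labeled point,
        # a crude stand-in for the noise-removal step the reviewers mention.
        for i in range(len(X)):
            if not keep[i]:
                continue
            d = np.linalg.norm(X - X[i], axis=1)
            keep[(d < self.sigma) & (y != y[i])] = False
        self.X_, self.y_ = X[keep], y[keep]
        return self

    def predict(self, Xq):
        out = []
        for x in np.asarray(Xq, dtype=float):
            d = np.linalg.norm(self.X_ - x, axis=1)
            j = int(np.argmin(d))
            # Abstain whenever the query is farther than tau from its nearest kept neighbor.
            out.append(self.y_[j] if d[j] <= self.tau else self.ABSTAIN)
        return np.array(out)

if __name__ == "__main__":
    clf = AbstainingNearestNeighbor(sigma=0.3, tau=1.0).fit([[0, 0], [1, 1]], [0, 1])
    print(clf.predict([[0.1, 0.0], [5.0, 5.0]]))  # -> [ 0 -1]; the distant query is rejected
```

As the usage line shows, a query close to a kept training point gets that point's label, while a query farther than tau from every kept point returns the ABSTAIN sentinel, which is the behavior the reviewers credit with providing robustness on well-separated data.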
[ 273, 3888, 11542, 11591, 81, 20452, 31221, 253, 3216, 5083, 275, 326, 1083, 7497, 31536, 594, 271, 2983, 326, 9010, 3888, 342, 1355, 11591, 81, 4181, 342, 247, 1027, 10509, 5203, 651, 320, 3731, 39651, 1060, 2717, 751, 326, 812, 320, 753, 347, 26309, 403, 29607, 1048, 50276, 19, 281, 923, 247, 1027, 533, 2905, 2523, 342, 253, 5426, 908, 323, 10237, 2228, 5467, 247, 1159, 269, 4336, 33772, 253, 4473, 9113, 285, 452, 5058, 2228, 840, 327, 253, 581, 1133, 824, 1566, 943, 417, 320, 2104, 281, 452, 271, 48960, 1650, 984, 667, 20452, 651, 320, 9113, 9577, 26332, 8564, 247, 1818, 275, 247, 5798, 2460, 281, 10007, 352, 715, 247, 4370, 5406, 285, 672, 253, 1566, 2296, 352, 310, 247, 4370, 359, 1385, 436, 347, 2228, 2299, 253, 5426, 908, 275, 436, 2929, 651, 1335, 1581, 281, 5276, 49795, 48960, 2228, 323, 253, 1566, 3877, 326, 2045, 2987, 24088, 253, 11106, 789, 273, 10279, 610, 1162, 355, 1093, 403, 4536, 29688, 2931, 323, 247, 4758, 326, 253, 20452, 2550, 1818, 253, 3216, 5083, 24088, 11542, 20452, 273, 3888, 513, 417, 1818, 1966, 31536, 594, 604, 253, 5203, 2544, 352, 651, 320, 3731, 39651, 533, 1060, 253, 6046, 4483, 29607, 2080, 26309, 50274, 262, 3133, 253, 4679, 7277, 253, 747, 1332, 342, 1896, 20965, 1075, 342, 247, 4872, 30410, 326, 310, 417, 4158, 281, 320, 10237, 891, 1158, 247, 22870, 83, 1039, 281, 921, 253, 5750, 273, 20965, 404, 310, 281, 921, 326, 634, 1332, 342, 20965, 404, 476, 7171, 1529, 2045, 1332, 326, 369, 4158, 281, 320, 10237, 24088, 970, 5899, 48960, 3733, 326, 651, 921, 247, 1524, 6923, 275, 752, 359, 476, 513, 342, 20965, 1075, 50276, 21848, 281, 253, 1840, 4606, 891, 1158, 253, 10527, 285, 253, 5661, 9021, 476, 320, 12814, 275, 247, 3710, 1039, 285, 7613, 891, 717, 625, 21802, 281, 5583, 18235, 50276, 7265, 5701, 50276, 249, 5933, 337, 1386, 374, 513, 368, 513, 436, 275, 690, 1340, 24088, 604, 767, 2792, 403, 387, 4181, 1679, 685, 40009, 368, 5386, 581, 273, 731, 390, 1097, 273, 731, 50276, 35844, 621, 846, 10012, 8319, 10380, 4665, 352, 347, 4645, 690, 830, 273, 12794, 5454, 2727, 875, 2323, 5912, 285, 20965, 1075, 2281, 327, 2622, 417, 48960, 14800, 533, 326, 1057, 417, 1646, 281, 320, 7933, 253, 1083, 323, 1650, 1469, 896, 281, 253, 1083, 273, 3888, 3877, 326, 253, 3280, 3268, 24088, 3888, 275, 260, 338, 274, 740, 1978, 616, 4473, 5203, 1014, 846, 20452, 24088, 1966, 31536, 1024, 581, 476, 2057, 1642, 247, 10237, 1566, 281, 3453, 247, 5203, 1014, 672, 3888, 403, 44711, 390, 320, 4136, 281, 20965, 404, 672, 247, 44711, 2460, 310, 1677, 275, 253, 6158, 1083, 247, 1566, 476, 2686, 452, 2233, 7200, 327, 253, 2622, 14800, 1223, 352, 1537, 452, 247, 2257, 273, 20965, 404, 327, 18539, 274, 1365, 44711, 2792, 253, 37808, 875, 619, 1650, 285, 253, 3935, 273, 10012, 8319, 3133, 281, 320, 2057, 45030, 432, 253, 958, 326, 368, 1581, 29607, 1048, 26309, 326, 588, 6524, 1818, 253, 5203, 390, 326, 337, 9866, 1754, 7274, 403, 417, 10481, 6422, 1060, 50275, 515, 23892, 337, 3239, 608, 50276, 664, 5467, 326, 387, 1878, 337, 50276, 3005, 6919, 273, 2280, 273, 253, 16888, 3268, 20926, 5246, 689, 310, 436, 323, 1046, 340, 671, 476, 368, 2319, 1880, 9376, 337, 5431, 3451, 327, 1524, 941, 50276, 249, 634, 4679, 2361, 275, 2829, 337, 3239, 854, 849, 1199, 253, 3904, 1818, 604, 368, 4388, 281, 755, 271, 18539, 274, 1365, 44711, 1127, 3731, 39651, 407, 2007, 34617, 752, 16988, 347, 247, 14905, 48960, 1650, 619, 14926, 1840, 281, 253, 10527, 15895, 285, 27947, 1057, 417, 3657, 368, 432, 7826, 4645, 247, 9712, 275, 841, 4679, 407, 1663, 17190, 253, 48960, 6667, 281, 320, 3731, 39651, 50276, 
261, 634, 16851, 34014, 872, 1598, 4020, 839, 253, 10237, 2228, 50274, 44295, 5701, 285, 963, 993, 50276, 783, 5203, 340, 4620, 7019, 275, 253, 4737, 273, 3239, 577, 347, 1327, 679, 5816, 50275, 17915, 255, 1057, 417, 1646, 281, 320, 253, 1682, 4327, 281, 1957, 253, 20965, 1075, 2281, 327, 2622, 941, 387, 1878, 352, 310, 1892, 281, 5476, 352, 1754, 327, 253, 14951, 50274, 5996, 30080, 22559, 4385, 50275, 35501, 323, 9628, 253, 2380, 19235, 253, 1077, 5044, 2523, 342, 253, 5426, 908, 275, 436, 2929, 285, 697, 12739, 281, 3946, 4558, 5061, 5336, 50276, 936, 19148, 253, 5426, 368, 816, 878, 281, 2770, 327, 253, 1563, 2969, 1650, 752, 604, 253, 1566, 556, 5058, 2495, 3775, 50276, 338, 368, 12230, 247, 1127, 352, 651, 1335, 320, 9113, 10509, 2568, 597, 1335, 921, 326, 48960, 6667, 403, 19455, 1014, 275, 436, 4758, 436, 2168, 2722, 1633, 310, 26401, 3430, 342, 253, 5426, 908, 50276, 12550, 2380, 310, 326, 253, 30539, 324, 735, 552, 588, 417, 755, 281, 1818, 253, 5203, 533, 760, 253, 3386, 533, 4496, 3877, 253, 34014, 310, 417, 4136, 281, 5206, 253, 5203, 253, 34014, 21460, 253, 3386, 285, 352, 310, 598, 281, 253, 1566, 281, 9113, 30215, 352, 390, 417, 604, 271, 30539, 2544, 253, 5406, 273, 247, 5798, 281, 281, 247, 5406, 273, 247, 4370, 253, 11454, 2036, 390, 667, 643, 1566, 943, 1067, 436, 4370, 285, 604, 1057, 1335, 5841, 352, 5798, 352, 651, 320, 247, 247, 10551, 417, 253, 643, 1039, 1475, 253, 3216, 5083, 26332, 253, 4473, 1159, 14802, 752, 310, 3451, 285, 752, 310, 417, 50275, 783, 1840, 2523, 310, 417, 21833, 352, 556, 1524, 2818, 327, 253, 4679, 285, 347, 891, 753, 352, 310, 1774, 281, 1304, 275, 253, 4679, 1880, 390, 417, 597, 8104, 1421, 281, 4588, 3731, 42070, 50276, 74, 3524, 841, 5701, 588, 1361, 11138, 253, 2929, 1580, 347, 891, 753, 253, 9400, 273, 436, 2929, 310, 247, 1077, 1774, 581, 285, 594, 4555, 984, 273, 436, 352, 310, 1774, 281, 452, 253, 30486, 987, 7152, 33032, 436, 2929, 19539, 690, 7936, 5441, 670, 49996, 326, 16216, 20965, 404, 2085, 247, 1327, 42070, 285, 616, 31640, 281, 48960, 26309, 275, 4706, 577, 597, 2085, 247, 906, 326, 824, 49996, 403, 1900, 14043, 281, 48960, 26309, 275, 247, 7681, 3282, 275, 1798, 627, 588, 1900, 320, 247, 966, 275, 534, 954, 3733, 6667, 476, 320, 12421, 44711, 275, 247, 1039, 326, 271, 13583, 5203, 588, 906, 4829, 2716, 253, 673, 275, 4706, 608, 597, 12661, 247, 7321, 5275, 570, 25194, 9162, 5933, 342, 767, 3602, 326, 1453, 20965, 1075, 285, 6046, 8570, 597, 2085, 5170, 14493, 327, 2228, 275, 247, 3632, 24822, 2983, 6974, 285, 46783, 29595, 5458, 841, 1543, 275, 2067, 625, 2173, 16691, 15216, 275, 4706, 84, 721, 50276, 24, 597, 2319, 3082, 281, 19928, 253, 767, 3602, 285, 2085, 5661, 1941, 273, 616, 10527, 1543, 50275, 296, 3755, 20556, 50276, 20881, 1255, 265, 50276, 7053, 253, 20544, 891, 1119, 253, 2929, 281, 320, 973, 34092, 285, 6571, 973, 15720, 352, 13698, 281, 18915, 247, 3965, 7936, 1895, 285, 3400, 690, 2590, 285, 2969, 1543, 275, 5886, 281, 247, 2969, 5933, 323, 253, 7234, 891, 10141, 253, 2929, 369, 22335, 3590, 50276, 284, 323, 253, 32213, 891, 1119, 690, 273, 253, 15965, 47284, 281, 320, 1892, 281, 956, 253, 27947, 285, 46159, 812, 1663, 5649, 432, 690, 21302, 285, 4931, 690, 2969, 1650, 15216, 23000, 891, 1119, 4266, 31488, 273, 253, 8542, 15988, 432, 616, 5933, 597, 1056, 247, 2014, 5301, 281, 247, 4872, 30410, 285, 697, 2761, 2176, 326, 253, 5301, 651, 417, 320, 347, 13857, 1411, 667, 1332, 326, 4483, 323, 9162, 275, 247, 1327, 26458, 8142, 342, 690, 1268, 273, 7162, 285, 3021, 690, 3626, 1268, 273, 20965, 1075, 50275, 250, 
27167, 318, 50276, 3169, 327, 253, 1840, 891, 3534, 247, 13716, 273, 818, 891, 858, 417, 1918, 247, 2169, 4868, 347, 891, 717, 417, 2119, 273, 253, 8542, 17200, 273, 253, 5125, 5933, 327, 253, 643, 1133, 253, 10527, 1543, 1646, 4891, 285, 403, 273, 247, 3965, 7936, 3753, 891, 513, 2085, 436, 4868, 342, 247, 13723, 273, 7043, 347, 891, 369, 417, 2104, 281, 2451, 512, 253, 10527, 7234, 534, 403, 253, 27882, 273, 436, 2929, 275, 619, 4743, 50276, 11183, 891, 717, 1066, 4971, 272, 619, 4868, 281, 247, 721, 1754, 327, 253, 11626, 273, 619, 7715, 30628, 352, 3133, 326, 4931, 253, 10527, 1543, 403, 1754, 327, 15216, 326, 403, 1512, 8077, 2531, 323, 253, 3114, 387, 1133, 25761, 627, 403, 4518, 690, 1239, 1430, 3374, 1754, 2220, 253, 9969, 273, 253, 643, 30628, 50275, 498, 274, 1877, 3533, 50276, 35640, 621, 50276, 18, 347, 5125, 4931, 352, 651, 320, 9371, 281, 2319, 390, 1908, 5368, 3082, 326, 2085, 9162, 342, 690, 1268, 273, 7162, 352, 3133, 326, 841, 8356, 2085, 690, 10732, 273, 20965, 1075, 672, 7162, 310, 417, 1029, 2217, 891, 1119, 352, 10084, 326, 5293, 273, 841, 497, 5393, 374, 891, 8344, 326, 253, 24864, 2144, 4428, 247, 1643, 10414, 327, 6667, 273, 253, 24822, 4322, 1566, 891, 1928, 751, 247, 6197, 275, 253, 2022, 2505, 651, 320, 9371, 281, 2085, 690, 3634, 323, 253, 17200, 273, 436, 4322, 1566, 495, 275, 253, 4737, 273, 289, 78, 7609, 253, 6242, 273, 247, 5203, 340, 342, 4644, 6919, 1679, 685, 16987, 4961, 407, 16968, 273, 1907, 387, 1878, 767, 13301, 1014, 342, 271, 20965, 1075, 4500, 6242, 651, 2186, 891, 6566, 406, 339, 793, 360, 3454, 253, 4477, 12661, 247, 4602, 875, 20965, 1075, 285, 31640, 281, 48960, 6667, 5742, 253, 4477, 21244, 326, 1293, 253, 3745, 281, 20965, 404, 667, 30410, 476, 320, 11213, 264, 407, 271, 48960, 20452, 275, 253, 4735, 2317, 597, 23000, 2085, 1543, 285, 4679, 8664, 253, 1463, 5438, 273, 247, 4373, 19484, 326, 43569, 253, 20965, 1075, 50276, 856, 84, 253, 4477, 2486, 1142, 10527, 6260, 247, 1175, 3434, 310, 1160, 281, 2953, 253, 1895, 432, 2067, 4623, 14636, 50275, 5040, 50275, 783, 2929, 310, 1077, 2834, 281, 1239, 50274, 783, 10199, 1057, 417, 2085, 247, 4076, 1386, 273, 1869, 15265, 839, 253, 2929, 891, 717, 417, 6600, 326, 253, 2983, 1566, 2783, 326, 273, 271, 34014, 4136, 281, 1056, 29607, 1781, 9727, 275, 247, 8578, 273, 4735, 2317, 342, 642, 10806, 327, 253, 3280, 2317, 4961, 11358, 275, 253, 6239, 253, 789, 11106, 8516, 1162, 355, 4765, 4419, 326, 48566, 48960, 6667, 3464, 39662, 281, 1966, 16006, 50274, 783, 12002, 28070, 1543, 323, 667, 30410, 672, 275, 958, 253, 906, 3133, 281, 320, 323, 247, 5275, 570, 25194, 3740, 30410, 534, 310, 11555, 1677, 326, 253, 4758, 310, 3676, 6928, 50275, 783, 4903, 908, 4768, 403, 2834, 281, 1978, 3540, 273, 50276, 24330, 3374, 50275, 33921, 7609, 436, 310, 9090, 417, 2032, 323, 512, 49996, 253, 4737, 310, 1077, 2834, 281, 956, 285, 627, 403, 642, 4217, 4278, 1677, 275, 4677, 337, 253, 906, 3133, 751, 352, 778, 320, 2032, 407, 16968, 273, 253, 958, 326, 694, 79, 342, 465, 18, 13067, 247, 4373, 13568, 875, 667, 767, 2792, 840, 247, 12421, 6777, 4972, 342, 5912, 1249, 50276, 4259, 24473, 25808, 824, 247, 4373, 13568, 6524, 533, 253, 373, 642, 1921, 281, 2868, 824, 247, 1318, 275, 4735, 2317, 812, 320, 8107, 28136, 390, 1014, 8696, 1561, 253, 2491, 273, 269, 50275, 783, 5175, 275, 2593, 11102, 1057, 417, 1646, 281, 2486, 253, 897, 273, 1327, 324, 735, 24406, 1071, 3530, 275, 16344, 1880, 390, 417, 247, 7887, 310, 1512, 7654, 281, 320, 273, 897, 352, 651, 320, 3309, 281, 7472, 327, 801, 297, 404, 1071, 3530, 347, 973, 
347, 253, 3733, 873, 841, 1543, 1646, 2779, 281, 320, 689, 8491, 281, 253, 3733, 941, 50274, 783, 5175, 9978, 310, 12744, 4931, 627, 403, 2007, 4278, 275, 253, 23378, 9380, 533, 352, 310, 417, 1014, 2590, 281, 479, 849, 1142, 5971, 403, 275, 253, 873, 310, 352, 760, 767, 1677, 253, 8245, 273, 247, 4872, 30410, 50275, 783, 19843, 273, 253, 2929, 310, 18270, 14999, 352, 310, 2834, 281, 956, 253, 7680, 273, 1142, 273, 253, 39383, 3340, 8664, 616, 31376, 390, 3480, 10445, 50276, 28821, 326, 891, 1089, 841, 3374, 1512, 8664, 281, 5583, 9311, 891, 452, 417, 9257, 10141, 512, 273, 253, 27947, 187, 187, 4118, 18435, 27, 783, 2929, 19401, 253, 1895, 273, 20965, 1075, 275, 10237, 9162, 247, 1180, 273, 3374, 497, 3636, 275, 253, 7473, 7792, 285, 253, 4028, 369, 671, 417, 598, 281, 20041, 253, 4477, 943, 1379, 715, 2743, 253, 1077, 1142, 25799, 13991, 1160, 407, 253, 30628, 275, 13828, 247, 18520 ]
[ 1, 1, 1, ... ] (attention_mask: all ones; full list omitted for readability)
[ 273, 3888, 11542, ... ] (token-ID list belonging to the preceding example; full numeric encoding omitted for readability)
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper presents an attention based recurrent approach to oneshot learning it reports quite strong experimental results surpassing human performancehbpl on the omniglot dataset which is somewhat surprising because it seems to make use of very standard neural network machinery the authors also note that other have helped verify the results did soumith chintala reproduce the results and do provide source code after reading this paper im left a little perplexed as to where the big performance improvements are coming from as it seems to share a lot of the same components of previous work if the authors could report result from a broader suite of experiments like in previous work eg matching networks it would much more convincing an ablation study would also help with understanding why this model does so welldocsepthis paper describes a method that estimates the similarity between a set of images by alternatively attend each image with a recurrent manner the idea of the paper is interesting which mimic the humans behavior however there are several cons of the paper 1 the paper is now well written there are too many todo cite in the final version of the paper which indicates that the paper is submitted in a rush or the authors did not take much care about the paper i think the paper is not suitable to be published with the current version 2 the missing of the experimental results the paper mentioned the lfw dataset however the paper did not provide the results on lfw dataset at least i did not find it in the version of dec 13th 3 the experiments of omniglot dataset are not sufficient i suggest that the paper provides some illustrations about how the model the attend two images eg the trajectory of attenddocsepthis paper introduces an attentionbased recurrent network that learns to compare images by attending iteratively back and forth between a pair of images experiments show stateoftheart results on omniglot though a large part of the performance gain comes from when extracted convolutional features are used as input the paper is significantly improved from the original submission and reflects changes based on prereview questions however while there was an attempt made to include more qualitative results eg fig 2 it is still relatively weak and could benefit from more examples and analysis also why is the attention in fig 2 always attending over the full character although it is zooming in shouldnt it attend to relevant parts of the character attending to the full character on a solid background seems a trivial solution where it is then unclear where the large performance gains are coming from while the paper is much more polished now it is still lacking in details in some respects eg details of the convolutional feature extractor used that gives large performance gain ### Summary:
this paper shows some strong performance numbers but i agree with the reviewers that it requires more analysis of where those gains come from the model is very simple which is a positive but more studies such as ablation studies and other examples would help a lot
[ 30003, 310, 1677, ... ] (input_ids: token-ID encoding of the example text above; full list omitted for readability)
[ 1, 1, 1, ... ] (attention_mask: all ones; full list omitted for readability)
[ 30003, 310, 1677, ... ] (labels: same token-ID list as input_ids above; full list omitted for readability)
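The reviews in the example above treat matching networks as the natural point of comparison for one-shot classification on Omniglot. For readers unfamiliar with that baseline, the sketch below shows the core idea in a few lines: a query embedding attends over support-set embeddings and the attention weights vote for a label. It is an illustration only; the random embeddings, sizes, and function names are invented here, and this is the matching-networks formulation rather than the attention-based recurrent comparator the reviews discuss.

```python
import numpy as np

def matching_predict(support_emb, support_labels, query_emb, n_classes):
    """Label a query by softmax attention over support-set embeddings,
    in the spirit of matching networks for one-shot classification."""
    s = support_emb / np.linalg.norm(support_emb, axis=1, keepdims=True)
    q = query_emb / np.linalg.norm(query_emb)
    sims = s @ q                                  # cosine similarity to each support item
    attn = np.exp(sims) / np.exp(sims).sum()      # attention weights over the support set
    one_hot = np.eye(n_classes)[support_labels]   # (n_support, n_classes)
    return attn @ one_hot                         # predicted class distribution

# Toy 5-way 1-shot episode with random stand-in embeddings.
rng = np.random.default_rng(0)
support = rng.normal(size=(5, 64))
labels = np.arange(5)
query = support[2] + 0.1 * rng.normal(size=64)    # a query close to class 2
print(matching_predict(support, labels, query, n_classes=5).argmax())   # expected: 2
```

With a learned encoder in place of the random embeddings, this is roughly the kind of baseline suite the reviewers ask the authors to compare against.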
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper considers inference in hidden continuoustime semimarkov chains the authors derive a tractable algorithm for inference in these models notably the authors consider the challenging case of irregularly spaced data the introduced algorithm is of a forwardbackward type the output of this algorithm can be used for smoothing this brings inference in the model class in line with other similar models overall this paper makes a good contribution i have enjoyed reading the paper and it is wellwritten and together with the supplementary material provides sufficient detail to understand the proposed methodology the derivation of the forwardbackward equations for the considered model is nontrivial and at the same time a very natural question to ask a more principled inference method as an alternative to existing ad hoc ones is certainly welcome why not provide a description of the main forwardbackward algorithm step by step in the main paper instead of leaving in to supplementary section c where it is nicely described yes docsepthe paper describes a class of models which the authors characterize as being continuous time hidden semimarkov chains and provides an inferential framework for this class of models generalising the classical algorithms for hmm inference to this context this is a timely piece of work in the sense that there have been several innovations relating to inference for hsmms in recent years and the development herein seems somewhat more principled than many of them working directly in continuous time seems to allow for a relatively clean derivation of forward and backward equations that facilitate the inferential tasks of interest my impression is that there is currently interest in performing inference for this type of model and hence the work is likely to find an audience the main weaknesses to my mind are 1 the bulk of the paper is dedicated to fairly routine computations and the extent of the novelty is perhaps a little limited 2 a lack of connection with the wider literature on hidden continuous time processes 3 the empirical study lacks any compelling application or use case to showcase the model or demonstrate its broad importance 4 the manuscript seems to have been a little hastily prepared and would benefit from careful proofreading see the minor details in the next section for a few example typos the authors have indicated that societal implications are na for this work and given the nature of the paper i think that is appropriate i didnt note any particular limitations that are not discussed docsepthis paper propose inference algortihms for hidden continuous time semimarkov chains latent ctsmcs specifically it focuses on homogeneous and directiontime independent ctsmcs both inference of posterior probabilities of latent states given observations as well as maximuma posteriori state path are addressed the foundation of the algorithm is the observation that given a transition event at a specific time past and future dynamics become independant therefore the core element is to update the intensity of such event termed current as a function of time and state these updates involve integral equations that are solved using numerical methods strengths this work tackles a very fundamental and important problem the authors seem exploit the structured representation of directiontime independence in a elegant way namely by focusing on the representation of currents 
weaknesses although the setup is well written starting from section 22 the derivations become hard to follow and it is hard to judge all the elements the initial conditions are not clear additionally the numerical algorithm as well as the overall complexity is not well described see suggestions below regarding missing descriptions and flow the authors adequately addressed the limitations docsepthe paper generalizes latent state inference from hsmms to continuoustime chains in latent space in this case the posterior is not simply proportional to the usual forward and backward probabilities instead the transition random variables currents are markov the authors take the limit of step size approaching 0 for these currents the paper shows that the marginal px t in this case is found by a convolution over the input currents similarly the output currents are a convolution over the input currents including observations the authors derive kolmogorov forward and backward equations leading to algorithms akin to the classical sumproduct belief propagation and maxsum viterbi messagepassing algorithms but in continuous time an additional advantage of the continuous formulation is the availability of adaptive stepsizing in the continuous case the algorithms as well as adaptive stepsizing are evaluated in 1d experiments strengths strong theoretical foundation important generalization of wellknown important algorithms to the continuoustime case adaptive stepsizing weaknesses the paper was a bit hard to follow from time to time this could either be due to my lack of background knowledge but also perhaps due to the quite liberal suppression of function arguments and sometimes rather cluttered mathematics midsentence minor some typos l271 therefor l317 differemces l189 equation reference failed l274 what does x refer to the authors properly discuss the limitations in the conclusion ### Summary:
all of the authors agree that the work meets the neurips standards with the two lowestscoring reviewers upping their recommendation from 4 to 5 on rebuttal the work is described as a fundamental and important problem and timely a good contribution reviewer 6ntb summarises the technical contribution the paper generalizes latent state inference from hsmms to continuoustime chains in latent space in this case the posterior is not simply proportional to the usual forward and backward probabilities instead the transition random variables currents are markov the authors take the limit of step size approaching 0 for these currents its clear to me that the the work has been communicated really well since all of the reviewers were able to grasp the paper and there was very little misunderstanding in the discussions there were some recommendations from the reviewers please ensure these are fixed in the cameraready version
[ 30003, 310, 1677, ... ] (input_ids: token-ID encoding of the example text above; full list omitted for readability)
[ 1, 1, 1, ... ] (attention_mask: all ones; full list omitted for readability)
[ 30003, 310, 1677, ... ] (labels: same token-ID list as input_ids above; full list omitted for readability)
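The reviews of the example above contrast the paper's continuous-time inference with the classical discrete-time forward-backward recursions, where the smoothing posterior is simply proportional to the product of forward and backward messages. As a reference point only, a minimal discrete-time version is sketched below; the initial distribution, transition matrix, and emission likelihoods are made-up toy values, and the continuous-time semi-Markov case the paper addresses does not reduce to this simple product.

```python
import numpy as np

def forward_backward(pi, A, lik):
    """Classical discrete-time HMM smoothing: gamma[t, i] = p(x_t = i | y_1..T).
    pi: (S,) initial distribution, A: (S, S) transition matrix,
    lik: (T, S) observation likelihoods p(y_t | x_t = i)."""
    T, S = lik.shape
    alpha = np.zeros((T, S))
    beta = np.ones((T, S))
    alpha[0] = pi * lik[0]
    for t in range(1, T):                 # forward pass
        alpha[t] = (alpha[t - 1] @ A) * lik[t]
    for t in range(T - 2, -1, -1):        # backward pass
        beta[t] = A @ (lik[t + 1] * beta[t + 1])
    gamma = alpha * beta                  # posterior is proportional to alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

# Toy usage with made-up parameters.
pi = np.array([0.6, 0.4])
A = np.array([[0.9, 0.1], [0.2, 0.8]])
lik = np.array([[0.7, 0.3], [0.4, 0.6], [0.1, 0.9]])
print(forward_backward(pi, A, lik))
```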
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper considers the problem of estimating the change in the performance of commercial ml apis ml as a service as the models are updated over time experiments are for 2020 vs 2021 it formalizes the problem as estimating the change in the confusion matrix over time the main theoretical contribution is an adaptive sampling method to more efficiently estimate this shift interesting empirical results on various ml apis are provided in the paper showing the relevance of the problem and the effectiveness of the proposed method originality and significance to the best of my knowledge this is the first work to systematically investigate the shift in the performance of commercial ml apis with the popularity of these services on the rise it is important to study various aspects of the models including the variations of the performance over time the proposed sampling method is also novel and can be used in similar settings where model queries are expensive the paper aims to minimize the frobenius norm of the error in estimating the confusion matrix of classifiers while keeping the sample complexity close to the optimal allocation strategy in hindsight the proposed algorithm asymptotically approaches the optimal allocation decay rate of 1n the empirical studies suggest that the proposed method can be an order of magnitude more sample efficient compared to random sampling quality and clarity the paper is well written and motivates the problem with case studies and examples the algorithms and theorems are clearly stated i did not check the proofs in the appendix some limitations of the work the method as described in the paper only applies to classification settings although not a strict requirement the experimental analysis of the sampling method benefits from a rough estimate of the difficulty of each example before it is evaluated by the ml api this difficulty is calculated by using a separate clientside cheap model this goes against one of the main appeals of ml apis that try to minimize clientside evaluation setup think installing tf runtime etc there are no ablation studies on the quantified role of using such client side models the experiments could be improved by comparison to baselines other than random sampling though this might not be possible with updated apis note i was a reviewer to an earlier version of this work compared to the previous version 1 the writing of the paper is improved in parts 2 there is a more clear discussion of related work and 3 the dataset is being publicly released this work opens a discussion around the problem of estimating the performance shift in commercial ml apis for classification the paper defines a metric for the performance shift of such apis via the confusion matrix and presents a method to achieve near optimal sampling rates the theoretical contributions of the paper are small but nontrivial the experimental analysis is detailed and interesting but could benefit from further ablation studies on the effect of the clientside difficulty gauge model the problem is of interest to the ml community and the release of the annotated dataset used in this work would be useful to the community docsepapi shifts are common in several deployed machine learning models this work proposes an efficient way to measure shifts in the confusion matrices of ml models using limited number of api calls the problem addressed is important and the method seems to 
yield good results in practice in comparison to random sampling one concern is its unclear why this problem cannot leverage some of the existing work on stratified sampling based on the explanation at the end of section 1 with the aim to reduce the variance of the estimator could you please elaborate on this after all the goal is to estimate elements of the confusion matrix based on section 3.3 does masa yield an optimal allocation of api calls

api shift is an important problem that needs to be addressed in order to operationalise ai this work proposes a way to assess these shifts using a limited number of api calls and has a lot of practical importance most deployed ml models are also priced in a manner wherein each api call has an associated cost and thus performing api shift assessment using limited calls is quite important even from a pricing standpoint

docsep

authors show that ml models behind publicly available apis change and these changes cause result changes for input datasets authors track the changes through confusion matrix differences they propose an efficient algorithm they call masa to evaluate the changes in results with a reduced number of queries their algorithm achieves better estimates given the same budget than uniform sampling

strengths
- tackles an important practical problem of the result differences from ml apis
- presents accuracy change results from a number of actual ml apis from leading providers
- authors created a novel algorithm masa to efficiently detect and evaluate result differences for ml apis
- authors demonstrate that masa significantly outperforms baseline algorithms

weaknesses
- accuracy changes from actual ml apis is limited in scope only few systems were analyzed and only for two dates spring 2020 and spring 2021 it would be interesting and important to track both more systems and more time points
- unclear how confusion matrixes were used for the speech recognition task which presumably has a very large number of classes i am guessing authors treated the speech recognition problem as a classification problem for evaluation however there are no details on this it would be good if this was explained and info provided
- as authors noted confusion matrix difference is a good measure of result drift only for certain classificationlike apis it would be good to see how to deal with nonclassification apis
- authors suggest that the differences seen in the confusion matrix provide useful insights into how api results changed and why they changed there is little substantiation of usefulness of how the api results changed ie is confusion matrix difference really the best we can do to show how the api results changed regarding why the results changed the authors provide guesses but it seems to me that we cannot really know based on the results if we cannot determine the why this should be stated if we can then it would be good to see what can be determined and how

minor typo page 7 diffident should be different

i think that the problem of ml api result shift is real and important i believe authors made an interesting and useful contribution in evaluating such shifts although the paper has some weaknesses i would recommend accepting it

docsep

a method to estimate the confusion matrix of a blackbox classifier using as few samples as possible the paper is focused around one use case tracking changes in ml apis considering that such changes may go unannounced

strengths
- the use case is one that is not often studied yet is important both from a business accountability perspective assessing the trustworthiness of a commercial ml api and an ethics perspective providing information to society about hidden dangers in commercial ml apis which are difficult to track because of the algorithmic intransparency that is a common problem in the tech industry
- the algorithm is uncomplicated to implement and provides clear improvement over naive sampling methods
- the collected dataset on api shifts is novel and is a major contribution of the work if released to the public

weaknesses
- the idea of difficulty levels k is not welldeveloped there are a couple of places where i was expecting more details 1 some experiments use k = 2 while others use k = 3 what is the justification for doing so 2 i was expecting an ablation study comparing masa with k = 1 vs masa with k > 1 as the paper stands it is unclear just how much of masas performance improvement over uniform and stratified sampling is due to k versus the uncertaintybased sample selection

other suggestions for improvement being able to measure the uncertainty in each partition and use that uncertainty to inform sample selection is the key idea that makes masa perform better than uniform or stratified sampling given the importance of this idea it would be better to explain this via an intuitive figure as a suggestion the authors could visualize how a partition in which the ml api classifies all data points with the same class whether right or wrong will have low uncertainty while the opposite gives high uncertainty

the paper makes atypical but important contributions to ml ethics although the proposed algorithm is not groundbreaking from a technical perspective it does contribute significantly towards measuring and tracking changes in blackbox apis and i think it is of high value to society i am concerned that the idea of difficulty levels k is not fully developed but i lean towards acceptance

### Summary:
the paper studies real world ml apis performance shifts due to api updates / retraining and proposes a framework to efficiently estimate those shifts the problem is very important and the presented approach definitely novel my concern is about the limited novelty of the theoretical analysis and the weak experimental evaluation just two dates limited number of systems tested small number of ablations as of now the paper looks like an interesting but unfinished proposal looking forward to the discussion between the authors and the reviewers to address the concerns

in the rebuttal the authors have addressed reviewers comments in particular by adding additional experiments that strengthen the paper all the reviewers recommend the paper to be accepted it is suggested that in the cameraready version the authors will add additional details regarding the experiments as some of the reviewers mentioned
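the uncertaintydriven allocation that the reviews describe as masas key idea can be made concrete with a short sketch, the grouping of inputs into partitions, the supplied per-partition uncertainty scores and the proportional budget rule below are illustrative assumptions only and not the papers actual implementation

```python
import numpy as np

def allocate_budget(uncertainties, budget):
    # give partitions with higher estimated uncertainty a larger share of the
    # query budget (simple proportional rule, illustrative only)
    w = np.asarray(uncertainties, dtype=float)
    w = w / w.sum()
    return np.maximum(1, np.floor(w * budget)).astype(int)

def estimate_api_shift(partitions, uncertainties, old_api, new_api, budget, rng):
    # partitions: list of arrays of held-out inputs, e.g. grouped by the old
    #   api's predicted class; uncertainties: one score per partition
    # old_api / new_api: callables returning predicted labels for a batch
    # returns a stratified estimate of how often the two api versions disagree,
    # a simple proxy for a shift in the confusion matrix
    per_part = allocate_budget(uncertainties, budget)
    total = sum(len(p) for p in partitions)
    shift = 0.0
    for part, n in zip(partitions, per_part):
        idx = rng.choice(len(part), size=min(int(n), len(part)), replace=False)
        labels_old = np.asarray(old_api(part[idx]))
        labels_new = np.asarray(new_api(part[idx]))
        rate = float(np.mean(labels_old != labels_new))
        shift += (len(part) / total) * rate
    return shift
```

the reweighting by partition size keeps the estimate unbiased even when the budget is spread unevenly across partitions, which is what lets an uncertainty-weighted allocation beat uniform sampling at the same cost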
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
this paper considers the average sensitivity of the euclidean k ellclustering problem which measures the stability of the output in total variation distance against deleting a random point from the input data the authors first show that d ell sampling has low average sensitivity and then show that any approximation algorithm for euclidean k ellclustering can be transformed into an algorithm with a low average sensitivity while almost preserving the approximation guarantee via a coreset construction approach they also extend their result to consistent and dynamic settings

strengths generally i think the topic is essential and the notion of average sensitivity on euclidean kclustering is interesting this paper first provides an analysis of the average sensitivity of euclidean kclustering which provides evidence of the stability of the wellstudied kmeans algorithm the writing is good and explains the contributions well

weaknesses i have some concerns about the main result theorem 3.1 see questions i may raise my score if the question is answered not

docsep

this work focuses on the euclidean k ellclustering problem with low average sensitivity kmedian and kmeans are two of the most studied problems and are special cases of this work they focus on the changes to the output distribution in the case that one of the input points is deleted they first show that d ell sampling has average sensitivity of okn which is also tight up to a constant factor they also show a general reduction from any algorithm to one that has low sensitivity by losing a factor 1 + epsilon they also show that their results can be extended to consistent and dynamic settings two very interesting settings in the literature

after rebuttal i do not find the responses of the authors to the second and third questions that i asked convincing 2 basically the authors are arguing that their result is only interesting in a special setting where delta is huge and if k logdelta their result is worse which is different from the claim in the paper 3 the argument here is hiding some of the most important factors and focusing only on log without explaining the reason behind it from my point of view the two main applications of this work are significantly weaker than what is presented in the paper reading the rebuttal and the other reviews i updated my score

the paper is in good shape and the results are clear i found the reduction interesting it is based on coreset construction from the analysis it seems that the approach cannot be extended to achieve better results as well it is always interesting to understand d ell sampling better since it is still one of the most basic algorithms for kmeans the ideas and techniques are nice but i do not see any major new ones in this work

i am concerned about the consistent and dynamic settings it seems that the authors have missed some of the most recent results for example for consistent clustering consistent kclustering for general metrics achieves better results than the one cited and also the results in this work moreover there are also results in a dynamic setting like fullydynamic coresets which is not considered in this work from my point of view the extensions to consistent and dynamic settings need more work and are not clear and interesting currently

typo line 26 even k = 2 i think it is covered notice that the goal of this paper is not private algorithms

docsep

the paper addresses the problems of consistent and dynamic clustering the metric used for measuring the consistency stability of an algorithm is the average sensitivity the authors first provide a proof that two known clustering and sampling algorithms have very small average sensitivity the authors then show how to transform an approximation algorithm for the euclidean clustering problem into a variation with guaranteed low average sensitivity this is by providing a novel coreset construction algorithm with low average sensitivity and then running the original approximation algorithm on the stable coreset lastly based on the low average sensitivity algorithm above the authors propose algorithms for the consistent and dynamic clustering tasks in the random order models

strengths
- the paper addresses important problems that are very relevant for the inevitably noisy data in the big data era
- the paper is very well written the formal claims are concise and are well explained beforehand the proofs are clear and seem correct
- the results are interesting the proposed low average sensitivity algorithm is nontrivial and makes an interesting use of the known compression scheme known as a coreset
- one of the conclusions of this work in my opinion is important for future coreset construction algorithms a low average sensitivity coreset construction algorithm yields a low average sensitivity approximation algorithm for tackling the problem this acts as another application for coresets which is not usually discussed in the coresets literature
- the link to consistent and dynamic algorithms in the randomorder model is important

weaknesses
- the dynamic clustering paragraph at line 76 is in the best case misleading many existing coresets for the kmeans problem do support fast update on new data arrival hence computing and maintaining such a coreset for the dynamic data also yields a good and fast solution for the problem under the same dynamicdata model
- the paper lacks a comparison or discussion on other dimensionality reduction techniques eg https://epubs.siam.org/doi/abs/10.1137/18m1209854
- no empirical evaluation or comparison to competing methods the empirical evaluation of the stability of the proposed algorithms and their impact when handling noisy datasets is crucial furthermore the comparison of those variants to the known nonstable algorithms can reveal whether the new variants are indeed necessary

presentation the disclaimer on the difference between the two very different sensitivity terms at the bottom of page 6 should have been made clearly emphasized earlier on the two terms are very confusing especially for the readers familiar with the coresets field

limitations and future work are not mentioned

minor comment line 175-176 can you provide a citation or a more formal claim

limitations are not clearly mentioned

### Summary:
i agree with the reviewers that the topic is essential the notion of average sensitivity on euclidean clustering is interesting and that the paper addresses an important problem that is very relevant for the inevitably noisy data in the big data era as complained the dynamic data section is a bit misleading compared to previous work and i suggest to remove it please also add the discussion in the rebuttal to the paper or at least the supp material
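the quantity that all the reviews above lean on, average sensitivity against deleting one uniformly random input point, can be written out explicitly, the formalization below is a plausible reading of the description in the reviews rather than a quotation of the papers own definition

```latex
% average sensitivity of a randomized algorithm A on input X = (x_1, ..., x_n):
% the expected total variation distance between A's output distribution on X
% and on X with a uniformly random point removed
\[
  \beta_A(X) \;=\; \frac{1}{n} \sum_{i=1}^{n}
    d_{\mathrm{TV}}\!\bigl( A(X),\; A(X \setminus \{x_i\}) \bigr)
\]
```

under this reading the claim about d ell sampling says that its output distribution barely moves when a single random point is dropped, and the coreset-based reduction transfers the same guarantee to any approximation algorithm run on top of the stable coreset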
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
this paper introduces a novel metalearning approach to unsupervised representation learning where an update rule for a base model ie an mlp is metalearned using a supervised metaobjective ie a fewshot linear regression from the learned representation to classification gts unlike previous approaches it metalearns an update rule by directly optimizing the utility of the unsupervised representation using the metaobjective in the phase of unsupervised representation learning the learned update rule is used for optimizing a base model without using any other base model objective experimental evaluations on fewshot classification demonstrate its generalization performance over different base architectures datasets and even domains

- novel and interesting formulation of metalearning by learning an unsupervised update rule for representation learning
- technically sound and well organized overall with details documented in appendixes
- clearly written overall with helpful schematic illustrations and in particular a good survey of related work
- good generalization performance over different larger and deeper base models activation functions datasets and even a different modality text classification
- motivations are not very clear in some parts eg the reason for learning backward weights v and the choice of metaobjective
- experimental evaluation is limited to fewshot classification which is very close to the metalearning objective used in this paper
- the result of text classification is interesting but not so informative given no further analysis eg why domain mismatch does not occur in this case

i enjoyed reading this paper and am happy to recommend it as a clear accept paper the idea of metalearning update networks looks a promising direction worth exploring indeed i hope the authors clarify the things i mentioned above experimental results are enough considering the space limit but not great since the current evaluation task is quite similar to the metaobjective evaluations on more diverse tasks would strengthen this paper

finally this paper aims at unsupervised representation learning but its not clear from the current title which is somewhat misleading i think thats quite an important feature of this paper so i highly recommend the authors to consider a more informative title eg learning rules for unsupervised representation learning or else

docsep

this work brings a novel metalearning approach that learns unsupervised learning rules for learning representations across different modalities datasets input permutations and neural network architectures the metaobjectives consist of fewshot learning scores from several supervised tasks the idea of using metaobjectives to learn unsupervised representation learning is a very interesting idea

authors mentioned that the creation of an unsupervised update rule is treated as a transfer learning problem and this work is focused on learning a learning algorithm as opposed to structures of feature extractors can you elaborate on what aspect of learning rules and why they can be transferable among different modalities and datasets for this type of metalearning to be successful can you discuss the requirements on the type of metaobjectives

besides saving computational cost does using smaller input dimensions favor your method over the reconstruction type of semisupervised learning eg vae

in the section generalizing over datasets and domains the accuracy of supervised methods and the vae method are very close this indicates those datasets may not be ideal to evaluate semisupervised training in the section generalizing over network architectures what is the corresponding supervised / vae learning accuracy

in the experimentation section can you describe in more detail how input permutations are conducted are they resampled for each training session for tasks if the input permutations are not conducted will the comparison between this method supervised and vae be different

after reviewing the author response i adjusted the rating up to focus more on novelty and less on polished results

docsep

the paper describes unsupervised learning as a metalearning problem the observation is that unsupervised learning rules are effectively supervised by the quality of the representations that they yield relative to subsequent later semisupervised or rl learning the learningtolearning algorithm allows for learning network architecture parameters and also networkinnetworks that determine the unsupervised learning signal based on pre and post activations

quality the proposed algorithm is well defined and it is compared against relevant competing algorithms on relevant problems the results show that the algorithm is competitive with other approaches like vae very slightly outperforms

clarity the paper is well written and clearly structured the section 5.4 is a bit hard to understand with very very small images

originality there is an extensive literature on metalearning which is expanded upon in appendix a the main innovation in this work is the parametric update rule for outer loop updates which does have some similarity to the old work by bengio in 1990 and 1992

significance
pros clear and seemingly stateoftheart results intuitive approach
cons only very modestly better than other methods i would like to get a feel for why vae is so good tbh though the authors show that vae has a problem with objective function mismatch

one comment the update rule takes as inputs pre and post activity and a backpropagated error it seems natural to also use the local gradient of the neurons transfer function here as many three or four factor learning rules do

### Summary:
the reviewers all agree that the idea is interesting the writing clear and the experiments sufficient

to improve the paper the authors should consider better discussing their metaobjective and some of the algorithmic choices
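the last review describes the learned update rule as a function of presynaptic activity, postsynaptic activity and a backpropagated error signal, a minimal sketch of such a parametric rule is shown below, the tiny mlp, its feature set and the inner-loop step are illustrative assumptions and not the architecture proposed in the paper

```python
import numpy as np

class LearnedUpdateRule:
    # parametric local rule: delta_w[i, j] = f_theta(pre_j, post_i, err_i),
    # where f_theta is a small mlp whose parameters theta are what an outer
    # (meta) loop would optimize; here theta is just randomly initialized
    def __init__(self, hidden=16, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(scale=0.1, size=(3, hidden))
        self.w2 = rng.normal(scale=0.1, size=(hidden, 1))

    def __call__(self, pre, post, err):
        # pre: (n_in,) presynaptic activity, post: (n_out,) postsynaptic
        # activity, err: (n_out,) backpropagated error for this layer
        n_out, n_in = post.shape[0], pre.shape[0]
        feats = np.stack([
            np.broadcast_to(pre, (n_out, n_in)),
            np.broadcast_to(post[:, None], (n_out, n_in)),
            np.broadcast_to(err[:, None], (n_out, n_in)),
        ], axis=-1)                               # (n_out, n_in, 3)
        h = np.tanh(feats @ self.w1)              # (n_out, n_in, hidden)
        return (h @ self.w2)[..., 0]              # (n_out, n_in) weight update

# one inner-loop step on a single layer: w += lr * rule(pre, post, err)
rule = LearnedUpdateRule()
pre, post, err = np.random.rand(8), np.random.rand(4), np.random.rand(4)
delta_w = 0.01 * rule(pre, post, err)             # shape (4, 8)
```

adding the local gradient of the transfer function, as the review suggests, would simply mean stacking one more channel into feats, turning the sketch into a four-factor rule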
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary the paper proposes a ganbased approach for dealing with adversarial instances with the training of a robust discriminator that is able to identify adversaries from clean samples and a generator that produces adversarial noise for its given input clean image in order to mislead the discriminator in contrast to the stateoftheart ensemble adversarial training approach which relies on several pretrained neural networks for generating adversarial examples the authors introduce a way for dynamically generating adversarial examples onthefly by using a generator which they along with their clean counterparts are then consumed for training the discriminator quality the paper is relatively wellwritten although a little sketchy and its motivations are clear the authors compare their proposed approach with a good of variety of strong defenses such as ensemble adversarial training and pgd adversarial training supporting with convincing experiments their approach originality xioa et al 2018 used very similar technique for generating new adversarial examples generator attack then used for training a robust discriminator likewise lee et al 2018 also used gans to produce perturbations for making images misclassified given this what is the main novelty of this approach comparing to the xioa et al 2018 and lee et al 2018 these references should be discussed in details in the paper moreover limited comparison with different attacks why did not compare against targeted attacks such as tfgs cw or ganattack it is really surprising that undefended network is working better showing more robustness than the defended network adversarial pgd on blackbox attacks why this is happening references xiao c li b zhu j y he w liu m song d 2018 generating adversarial examples with adversarial networks arxiv preprint arxiv180102610 lee h han s lee j 2017 generative adversarial trainer defense to adversarial perturbations with gan arxiv preprint arxiv170503387 docsepthe paper a direct approach to robust deep learning using adversarial networks proposes a gan solution for deep models of classification faced to white and black box attacks it defines an architecture where a generator network seeks to produce slight pertubations that succeed in fooling the discriminator the discriminator is the targetted classification model the paper is globally well written and easy to follow it well presents related works and the approach is well justified though the global idea is rather straightforward from my point of view it looks to be a novel effective application of gans the implementation is well designed it notably uses recent gan stabilization techniques the experiments are quite convincing since it looks to produce rather robust models without a loss of performance with clean which appears crucial to me and is not the case of its main competitors minor comments eq1 i do not understand the argmax the support is missing it corresponds to the class with higher probability i suppose but authors say that gans are usually useful for the generator this is not always the case by the way while in their case both obtained discriminator and generator have value i do not understand in what the generator could be useful here since it is only fitted to attack its own model so what is the interest are its attacks transferable on other models tables 1 and 2 are described as giving attack accuracies but scores reported are 
classification accuracy right this is rather defense accuracies so docsepthe paper proposed a defensive mechanism against adversarial attacks using gans the general network structure is very much similar to a standard gans generated perturbations are used as adversarial examples and a discriminator is used to distinguish between them the performance on mnist svhn and cifar10 demonstrate the effectiveness of the approach and in general the performance is on par with carefully crafted algorithms for such task pros the presentation of the approach is clean and easytofollow the proposed network structure is simple but it surprisingly works well in general descriptions of training details are reasonable and the experimental results across several datasets are extensive cons the network structure may not be novel though the performance is very nice there are algorithms that are carefully crafted to perform the network defense mechanism such as samangouei et al 2018 however the method described in this paper despite simple works very good it would be great if authors can provide more insights on why it works well though not the best but still reasonable besides only demonstrating the experimental results it would also be nice if authors can visualize the behavior of their design by showing some examples using the dataset they are working on and provide sidetoside comparisons against other approaches ### Summary:
the paper proposed a gan approach to robust learning against adversarial examples, where a generator produces adversarial perturbations and a discriminator is used to distinguish between adversarial and raw images. the performance on mnist, svhn and cifar10 demonstrates the effectiveness of the approach, and in general the performance is on par with carefully crafted algorithms for such a task. the architecture of the gan used in the paper is standard, yet the defensive performance seems good; the reviewers wonder about the reason behind this good mechanism and about the novelty compared with other works in a similar spirit. in response the authors add some insights discussing the mechanism as well as comparisons with the other works mentioned by the reviewers. the reviewers all think that the paper presents a simple scheme for robust deep learning based on gans which shows its effectiveness in experiments; the understanding of why it works may need further exploration. thus the paper is proposed to be a borderline lean accept
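The reviews and summary above describe the defense only in prose: a generator maps a clean image to a small perturbation, and the discriminator — described by one reviewer as a clean-vs-adversarial detector and by another as the target classifier itself — is trained on both clean and generated adversarial inputs. The sketch below is a minimal, hypothetical rendering of that general scheme, not the paper's actual implementation; the architecture, the perturbation budget `EPS`, the negative cross-entropy generator loss, and the equal weighting of clean and adversarial terms are all illustrative assumptions, and `D` stands for any image classifier returning class logits.

```python
# Hedged sketch (not the paper's code) of the generator-vs-discriminator defense
# described in the reviews. Sizes, EPS, and loss weighting are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

EPS = 8.0 / 255.0  # assumed L-infinity perturbation budget

class PerturbationGenerator(nn.Module):
    """Maps a clean image to a bounded additive perturbation."""
    def __init__(self, channels: int = 3, width: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, channels, 3, padding=1), nn.Tanh(),  # output in [-1, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return EPS * self.net(x)  # perturbation bounded by EPS

def train_step(G, D, opt_G, opt_D, x, y):
    """One alternating update: G tries to fool D, then D is trained on clean
    plus freshly generated adversarial inputs (classifier reading of D, an assumption)."""
    # 1) Generator update: maximize D's classification loss on x + G(x).
    x_adv = torch.clamp(x + G(x), 0.0, 1.0)
    loss_G = -F.cross_entropy(D(x_adv), y)
    opt_G.zero_grad(); loss_G.backward(); opt_G.step()

    # 2) Discriminator/classifier update on clean and (detached) adversarial batches.
    x_adv = torch.clamp(x + G(x), 0.0, 1.0).detach()
    loss_D = F.cross_entropy(D(x), y) + F.cross_entropy(D(x_adv), y)
    opt_D.zero_grad(); loss_D.backward(); opt_D.step()
    return loss_G.item(), loss_D.item()
```

Under this reading the scheme is essentially adversarial training with a learned attacker; if the paper instead trains D as a binary adversarial-vs-clean detector, the D loss would be a binary cross-entropy over clean versus generated inputs rather than over the class labels.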
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary of results this empirical paper finds that deep neural network sharpness as measured by the top eigenvalue of the hessian tends to saturate at or hover just above the value 2eta where eta is the step size in gradient descent gd during the course of optimization this is accompanied by nonmonotonicity in the loss results also hold for gradient descent with momentum this phenomena occurs for fullbatch gd a variety of tasks and two different loss functions square loss and crossentropy although in the latter case latetime dynamics of the sharpness is different with the sharpness decreasing the result also holds across a variety of architectures vgg11 with and without batch norm convolutional networks and fullyconnected networks with different nonlinearities a deep linear network a transformer model although i comment on the architecture dependence below the authors refer to this phenomenon of sharpness hovering at or above the 2eta bound as optimization on the edge of stability the authors posit that this observation goes against our current understanding of optimization in deep learning i theoretical work may use assumptions such as monotonic descent or lsmoothness which is in violation of the edge of stability regime where the loss behavior is not monotonic ii relatedly it corresponds to a regime of instability in a quadratic taylor approximation so this is not a good assumption either the empirical observations do not appear to straightway carry over to sgd although there is some discussion of similarities in the appendix the authors leave this is as an open problem quality and clarity the work is of good quality the experiments that are presented clearly demonstrate the described phenomenon attempt to sample from a variety of problems ranging from a simple 1d regression task to training vgg11 on cifar10 in fig1 and are wellexplained originality the observed phenomena of sharpness progressively increasing and then hovering at or near 2eta through much of gd dynamics appears new to the best of my knowledge however prior papers have pointed out that gd optimization can stably proceed for learning rates above 2eta and are accompanied with nonmonotonic loss one example is lewkowycz et al 2020 significance i think the observations are somewhat important although perhaps not as much as the authors seem to stress in writing eg in the abstract our results shed significant light on the dynamics of gradient descent with a fixed step size this paper focuses only on empirical observations rather than explanation of the mechanism behind the stability the results are for fullbatch gd which somewhat limits the applicability of the results to practice where sgd is more common nonetheless i do think understanding how optimization is stabilized in this regime is an interesting problem other comments i think that a key point not fully understood at least as expressed in the writing or investigated in the experiments is the role of network width ie im skeptical that the observations will be consistent across architectures as expressed in the opening paragraph in the introduction if wider networks are also investigated empirically this is a main reason for not giving the paper a higher score in appendix d the authors try to reconcile the results with existing theory on infinitewidth limits fig 11c shows that across networks of varying width but trained at the same learning rate wider networks 
end up with smaller values of sharpness at the end of training here say that all experiments are stopped at the same value of the training loss hence if a narrower network saturates the 2eta bound at late times in gradient descent the wider networks will fall below this value this is of course consistent with no evolution in the curvature in the infinite width limit that is to say whether or not the edge of stability regime is reached depends quite strongly on how wide the networks are while some realistic networks are explored eg vgg11 narrow transformers which is a positive point all of the remaining networks used in the experiments as far as i can gather are on the narrow side eg layer widths of order 100 or 200 i think a shortcoming of the paper is that the strong width ie architecture dependence of the phenomenon is not fully appreciated or discussed by the authors eg for instance by discussing in the main text that it only sets in for narrow networks or investigated empirically alternatively noting for readers that all the networks chosen are rather narrow out of transparency i note that the authors do mention in appendix d one might hypothesize that progressive sharpening might attenuate as networks with ntk parameterization become increasingly wide however i dont think that ntk parameterization is necessary for this to be true in short i think the width dependence of the phenomenon is an important factor that affects the significance and applicability of the observations and could have been treated with greater transparency with additional experiments and additions to the main text and abstract a comment on relation to prior work the authors write that lewkowycz et al 2020 imply that actual progress would occur in regions where the sharpness remains strictly less than 2eta our experiments demonstrate otherwise i dont believe this is a conclusion of that paper progress happens when sharpness is above 2eta note also that the paper tends to study wide networks could the authors elaborate on what they mean heredocsep this paper presents an interesting observation for gd that is the sharpness of the learnt model in the final phase of the training measured by the largest eigenvalue of the training loss hessian hovers right at the value 2eta while the training loss at the same time the loss goes to unstable and nonmonotonically decreasing this pattern is consistent across architecture activation functions tasks loss functions and bn comprehensive experiments are conducted to show this common observation the paper is easy to follow besides the empirical results in the main body authors give insightful discussions in intro and related work section specifically authors propose a novel guess that gd eventually transitions to edge of stability where gd can finally succeed with nonsmall enough step size although i am not sure how gd can do this the concept of edge of stability is still attractive i have two concerns for this work 1 authors did not investigate why sharpness finally hover over 2eta is it a trivial consequence followed by some relationship between the update rule of gd and the definition of sharpness without any condition even if yes we may further think about how to leverage it along the existing discussions in this paper hope to have authors feedbacks on this later 2 given people use sgd to train neural networks discussions about the insight from the observation of gd to sgd will enhance the impact of this paper docsepthis work identifies a new empirical phenomenon in the 
training dynamics of deep nets when trained with fullbatch gd the curvature of the train loss increases up to a critical value of 2step size at which point it plateaus for the remainder of training this phenomenon is demonstrated robustly for networks trained with mse loss across various architectures and datasets and a slightly weaker version of this holds for crossentropy loss as well this work contributes to our understanding of deep network dynamics it is a precise and apparently robust phenomenon that was surprisingly not noticed before perhaps because of the requirement of gd vs sgd in terms of impact this work will be instructive for dl optimization theory since it points out that certain assumptions which are usually made in theoretical works eg step size curvature are far from true in practice moreover it guides theory towards more realistic assumptions it may also have later impact in practice by leading to a better understanding of the interaction between optimization algorithm step size and architecture thus i recommend acceptance weaknesses and desired clarifications it should be mentioned more prominently that these results are primarily for networks trained with mse loss and that a similar but weaker phenomena holds for crossentropy loss why is the main example in section 3 given for a nonstandard network for cifar10 a 2layer mlp with elu activation why not a standard network with standard activation vgg11 or resnet18 etc the distinction between sgd and gd seems crucial for this phenomenon so more discussion would be good in particular as noted in the related works some papers using sgd claim an opposite effect this is especially important to clarify since sgd is most often used in practice if time allows experiments with increasing batch size could shed light on the importance of gd vs sgd the related works is currently written as an account of what previous works do not do as opposed to what they do it would help contextualize this work to relate it to prior works which are consistent or inconsistent with this phenomena especially works studying the hessian of deep nets some of the mechanisms proposed in prior works eg lewkowycz et al 2020 and works on deep linear networks may also be helpful to understand the phenomena in this work comments which do not affect the score i wonder if you have measured the 2nd eigenvalue during training as well in particular after the 1st eigenvalue has saturated at 2eta does the 2nd eigenvalue also progressively sharpen up to 2eta and so on for later eigenvals i am glad to see the experiments on deep linear networks it suggests that it may be possible to theoretically understand this phenomenon in such simple settings this would be a nice topic for future workdocsepsummary this submission numerically shows that during exploring the neural network landscape gd flow keeps increasing the sharpness as a result gd with a fixed learning rate will exhibit two phases during the dynamics denote by eta the fixed learning rate in the first phase gd follows closely to the gd flow and it finally converges to a region where the sharpness is roughly 2eta then it transits into the second phase during which the sharpness hovers right at or above 2eta in the second phase gd cannot increase the sharpness anymore due to the dynamical stability constraint thus the authors name it the edge of stability phase what is interesting is that in the edge of stability phase the loss is still decreasing steadily although not monotonically pros i enjoy reading this 
submission it is clearly written and the numerical evaluation is also sufficient to my best of knowledge the observation that the edge of stability happens during the whole late phase of gd dynamics is new it reveals a very complicated dynamical behavior of gd for training neural networks which has not been systematically investigated before thus i think this submission made a very important and original contribution to the understanding of gd dynamics in deep learning cons the relationship with the previous study on the dynamical stability of sgd is not sufficient discussed in my opinion just saying previous works have argued that the stability properties of optimization algorithms could potentially serve as a form of implicit bias in deep learning is obviously not precise and enough a large number of numerical results in 12 already showed that the edge of stability happens for the convergent solutions which implies that the edge of stability must happen at least in the very late phase of gd dynamics the new finds of this submission are that the edge of stability actually holds for a large portion of gd dynamics which is very unexpected the authors should explicitly mention that the edge of stability was already observed in these previous works giving the right credit to the right references does not harm the contribution of this submission especially the jargon edge of stability was first used in 1 and the authors even did not mention it 1 giladi niv et al at stabilitys edge how to adjust hyperparameters to preserve minima selection in asynchronous training of neural networks arxiv preprint arxiv190912340 2019 2 wu lei chao ma and e weinan how sgd selects the global minima in overparameterized learning a dynamical stability perspective advances in neural information processing systems 2018 ### Summary:
the paper demonstrates that gradient descent generally operates in a regime where the spectral norm of the hessian is as large as possible given the learning rate. the paper presents a very thorough empirical demonstration of the central claim, which was appreciated by the reviewers. a central issue to me in accepting the work was its novelty, as prior work has shown very closely related effects for sgd; the reviewers appreciated in discussions the novelty of the precise claim about the spectral norm hovering at around 2/eta. r4 and r2 also raised the issue that the related work discussion is not sufficient; please make sure that you discuss related work very carefully in the paper, including a more detailed discussion in the introduction. the two key issues raised by r3, who voted for rejection, were that (1) the work studies gradient descent rather than sgd and (2) lack of theory. i agree with these concerns. perhaps the authors should address (1) by citing more carefully prior work that shows that a similar phenomenon does seem to happen in training with sgd. as for (2), i agree here with r1, r2 and r4 that empirical evaluation is a key strength of the paper. based on the above it is my pleasure to recommend the acceptance of the paper. thank you for submitting your work to iclr and please make sure you address all remarks of the reviewers in the camera-ready version
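Both the reviews and this summary lean on the threshold 2/eta without restating where it comes from. The usual local-quadratic stability calculation — textbook material, not something taken from the paper under review — is sketched below; it is the reason a sharpness above 2/eta is described as the edge of stability.

```latex
% Standard gradient-descent stability calculation on a local quadratic model
% (textbook argument, included only to motivate the 2/eta threshold above).
\[
  L(\theta) \;\approx\; L(\theta^\ast) + \tfrac{1}{2}\,(\theta-\theta^\ast)^\top H\,(\theta-\theta^\ast),
  \qquad H = \nabla^2 L(\theta^\ast),
\]
\[
  \theta_{t+1} = \theta_t - \eta\,\nabla L(\theta_t)
  \;\;\Longrightarrow\;\;
  \theta_{t+1}-\theta^\ast = (I - \eta H)\,(\theta_t-\theta^\ast).
\]
\[
  \text{Along an eigenvector of } H \text{ with eigenvalue } \lambda:\qquad
  |1-\eta\lambda| \le 1 \;\iff\; 0 \le \lambda \le \frac{2}{\eta}.
\]
% Hence the iterates stay bounded only while the sharpness lambda_max(H) is at most
% 2/eta; a mode with lambda > 2/eta oscillates with growing amplitude, which is why
% the loss can behave non-monotonically when GD sits at the edge of stability.
```

This is also why the non-monotonic loss the reviewers mention is expected rather than contradictory under this model: individual sharp modes can transiently grow while the loss still trends downward overall.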
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 977, 5701, 891, 1158, 326, 247, 2234, 1127, 417, 4751, 7192, 387, 1878, 347, 4469, 275, 253, 4028, 390, 6949, 275, 253, 4679, 310, 253, 2554, 273, 2990, 4871, 26332, 516, 33872, 326, 253, 7313, 588, 320, 5185, 2439, 35615, 347, 4469, 275, 253, 5909, 12494, 275, 253, 10199, 604, 14200, 6928, 403, 671, 6949, 45190, 436, 310, 247, 2022, 1921, 323, 417, 4933, 253, 2929, 247, 2169, 4868, 275, 30762, 277, 253, 4477, 1611, 281, 42853, 253, 1543, 342, 5368, 3762, 327, 11968, 3429, 7787, 3036, 1903, 68, 2722, 326, 2439, 6928, 273, 11962, 4871, 533, 10166, 387, 253, 1072, 4715, 2281, 14200, 6928, 990, 598, 342, 4577, 2193, 273, 9479, 1255, 387, 253, 990, 273, 3733, 1060, 1333, 326, 512, 4679, 403, 6331, 387, 253, 1072, 1318, 273, 253, 3733, 2957, 7613, 604, 247, 39937, 2990, 19004, 684, 253, 374, 1464, 3033, 387, 3563, 2069, 275, 11786, 18499, 253, 14200, 6928, 588, 2965, 2708, 436, 1318, 436, 310, 273, 2282, 5185, 342, 642, 5606, 275, 253, 16841, 275, 253, 11968, 4871, 2701, 326, 310, 281, 1333, 1880, 390, 417, 253, 5024, 273, 7882, 9459, 310, 4925, 7024, 3240, 7052, 327, 849, 4618, 253, 6928, 403, 50275, 6050, 690, 15958, 6928, 403, 14859, 24088, 362, 1266, 883, 6891, 4979, 398, 534, 310, 247, 2762, 1127, 512, 273, 253, 5780, 6928, 908, 275, 253, 4679, 347, 2080, 347, 891, 476, 9580, 403, 327, 253, 6891, 1930, 24088, 3828, 34414, 273, 1340, 2233, 390, 1052, 891, 1158, 247, 2159, 4202, 273, 253, 2929, 310, 326, 253, 2266, 4871, 26332, 10336, 10096, 273, 253, 11562, 310, 417, 4751, 14109, 390, 5469, 407, 253, 4477, 24088, 323, 4227, 407, 16585, 275, 253, 2022, 2505, 326, 352, 760, 5239, 275, 323, 6891, 6928, 390, 6949, 45190, 31506, 15806, 323, 10668, 326, 512, 253, 6928, 6777, 403, 2581, 6891, 562, 273, 22107, 50276, 74, 3877, 326, 253, 4477, 513, 3748, 275, 30762, 277, 581, 1537, 41661, 326, 13439, 9479, 2980, 1537, 24733, 6340, 347, 6928, 342, 295, 17922, 4764, 1320, 2489, 9592, 4618, 2299, 891, 13414, 1158, 326, 295, 17922, 4764, 1320, 310, 3309, 323, 436, 281, 320, 2032, 275, 2159, 891, 1158, 253, 4871, 10096, 273, 253, 11562, 310, 271, 1774, 2803, 326, 11852, 253, 8453, 285, 30437, 273, 253, 7313, 285, 812, 452, 644, 4127, 342, 3687, 22107, 342, 3081, 4679, 285, 30733, 281, 253, 2022, 2505, 285, 12002, 50276, 66, 4385, 327, 5886, 281, 2720, 789, 253, 4477, 3630, 326, 458, 30567, 319, 90, 14617, 1162, 355, 9169, 16084, 326, 4588, 4780, 651, 2826, 275, 4811, 835, 253, 9479, 1255, 4558, 13714, 1679, 685, 374, 1464, 776, 4679, 7568, 5010, 891, 13414, 2868, 436, 310, 247, 6452, 273, 326, 2929, 4780, 6569, 672, 9479, 1255, 310, 1840, 374, 1464, 3877, 671, 326, 253, 2929, 14280, 281, 1263, 4618, 6928, 812, 253, 4477, 21184, 327, 752, 597, 1599, 34924, 406, 33032, 436, 2929, 10262, 271, 4722, 8310, 323, 305, 69, 326, 310, 253, 9479, 1255, 273, 253, 34003, 1566, 275, 253, 2457, 3408, 273, 253, 3733, 4080, 407, 253, 6253, 25023, 273, 253, 3733, 2957, 344, 859, 757, 8511, 735, 987, 387, 253, 1318, 374, 1464, 1223, 253, 3733, 2957, 387, 253, 1072, 673, 253, 2957, 4566, 281, 17631, 285, 1327, 2163, 14639, 1037, 11052, 436, 3102, 310, 5185, 2439, 10336, 5743, 3470, 8892, 2957, 3470, 285, 270, 79, 11088, 4679, 403, 5196, 281, 921, 436, 1846, 8310, 253, 2929, 310, 3477, 281, 956, 50276, 67, 11587, 253, 16774, 1543, 275, 253, 2022, 2133, 4477, 1918, 47860, 11985, 275, 26432, 285, 2905, 789, 2593, 5742, 4477, 12661, 247, 4460, 5476, 326, 305, 69, 6524, 16307, 281, 5024, 273, 7882, 835, 305, 69, 476, 4720, 9302, 342, 14122, 78, 455, 2217, 3213, 1979, 3738, 891, 717, 417, 2119, 849, 305, 69, 476, 513, 436, 253, 
4473, 273, 5024, 273, 7882, 310, 1335, 12994, 50275, 74, 452, 767, 7350, 323, 436, 789, 50276, 18, 4477, 858, 417, 7409, 2139, 9479, 1255, 4720, 26445, 689, 374, 1464, 310, 352, 247, 14916, 9936, 3560, 407, 690, 2954, 875, 253, 5731, 4086, 273, 305, 69, 285, 253, 5426, 273, 9479, 1255, 1293, 667, 1617, 1014, 604, 4754, 359, 778, 2007, 1158, 670, 849, 281, 25057, 352, 2112, 253, 5368, 11985, 275, 436, 2929, 3524, 281, 452, 4477, 8680, 84, 327, 436, 1996, 50276, 19, 1677, 952, 897, 256, 35333, 281, 6194, 11454, 6928, 11985, 670, 253, 12288, 432, 253, 8310, 273, 305, 69, 281, 256, 35333, 588, 7278, 253, 3486, 273, 436, 2929, 50276, 7152, 33032, 2520, 789, 22649, 247, 747, 16774, 11562, 275, 253, 3733, 8062, 273, 3676, 37507, 672, 10166, 342, 2120, 23941, 305, 69, 253, 16841, 273, 253, 6194, 2957, 5459, 598, 281, 247, 4619, 1318, 273, 374, 10539, 1979, 387, 534, 1127, 352, 5340, 666, 323, 253, 6414, 273, 3733, 436, 11562, 310, 5183, 10237, 314, 323, 6928, 10166, 342, 278, 339, 2957, 2439, 2710, 35615, 285, 15302, 285, 247, 5777, 21076, 2715, 273, 436, 6556, 323, 2831, 290, 10144, 2957, 347, 973, 436, 789, 17904, 281, 776, 4685, 273, 3676, 2990, 8062, 50276, 262, 310, 247, 10799, 285, 8505, 10237, 11562, 326, 369, 19143, 417, 8344, 1078, 4931, 984, 273, 253, 8284, 273, 305, 69, 4632, 256, 35333, 50276, 249, 2426, 273, 3486, 436, 789, 588, 320, 49664, 323, 45439, 13757, 3762, 1580, 352, 2792, 562, 326, 2176, 13260, 534, 403, 3798, 1160, 275, 10527, 2987, 24088, 3213, 1979, 50276, 1915, 87, 1177, 403, 2080, 432, 2032, 275, 3946, 50276, 3062, 1189, 352, 22591, 3762, 4404, 625, 15958, 13260, 352, 778, 671, 452, 1996, 3486, 275, 3946, 407, 4283, 281, 247, 1805, 4685, 273, 253, 5016, 875, 13757, 5933, 3213, 1979, 285, 10336, 3021, 891, 5583, 14924, 50276, 20881, 1255, 265, 285, 6799, 8254, 6787, 50276, 262, 943, 320, 5393, 625, 46454, 326, 841, 1543, 403, 8558, 323, 6928, 10166, 342, 278, 339, 2957, 50276, 395, 326, 247, 2074, 533, 21076, 16958, 6556, 323, 2831, 290, 10144, 2957, 50275, 22309, 310, 253, 2022, 1650, 275, 2593, 495, 1677, 323, 247, 1327, 15291, 2990, 323, 260, 338, 274, 740, 247, 374, 12026, 13361, 81, 342, 1045, 86, 5743, 2139, 417, 247, 2629, 2990, 342, 2629, 5743, 362, 1266, 883, 390, 501, 3024, 1093, 3966, 50276, 783, 13812, 875, 256, 35333, 285, 305, 69, 3133, 9560, 323, 436, 11562, 594, 625, 5955, 651, 320, 1175, 275, 1798, 347, 4879, 275, 253, 2905, 2987, 690, 9380, 970, 256, 35333, 1750, 271, 7285, 1055, 436, 310, 3340, 1774, 281, 19148, 1580, 256, 35333, 310, 954, 2223, 908, 275, 3946, 604, 673, 4483, 4679, 342, 3629, 14604, 1979, 812, 17914, 1708, 327, 253, 6349, 273, 305, 69, 4632, 256, 35333, 50276, 783, 2905, 2987, 310, 4390, 3542, 347, 271, 2395, 273, 752, 2045, 2987, 513, 417, 513, 347, 10066, 281, 752, 597, 513, 352, 651, 1361, 33876, 907, 436, 789, 281, 14588, 352, 281, 2720, 2987, 534, 403, 5185, 390, 16706, 342, 436, 16958, 50276, 20432, 2987, 12392, 253, 344, 859, 757, 273, 3676, 37507, 690, 273, 253, 6297, 4081, 275, 2720, 2987, 24088, 458, 30567, 319, 90, 14617, 1162, 355, 9169, 285, 2987, 327, 3676, 4872, 6928, 778, 671, 320, 9371, 281, 2096, 253, 16958, 275, 436, 789, 50274, 26122, 534, 513, 417, 2818, 253, 4868, 50275, 74, 4282, 604, 368, 452, 4080, 253, 374, 2109, 25023, 1309, 3733, 347, 973, 275, 1798, 846, 253, 337, 296, 25023, 556, 23543, 387, 374, 1464, 1057, 253, 374, 2109, 25023, 671, 31414, 17614, 3878, 598, 281, 374, 1464, 50276, 395, 594, 327, 323, 1996, 9216, 9863, 50276, 74, 717, 9995, 281, 923, 253, 4679, 327, 3676, 4872, 6928, 352, 5936, 326, 352, 
778, 320, 1896, 281, 28055, 2096, 436, 11562, 275, 824, 2969, 7533, 436, 651, 320, 247, 5322, 9400, 323, 2852, 789, 7152, 339, 793, 360, 3454, 50276, 2520, 19529, 27184, 2722, 326, 1309, 18216, 253, 11454, 2990, 13016, 50276, 35333, 2685, 11359, 3629, 253, 9479, 1255, 50276, 284, 247, 906, 305, 69, 342, 247, 4229, 4715, 2281, 588, 10738, 767, 12475, 1309, 253, 8062, 50276, 3354, 1584, 407, 1162, 66, 253, 4229, 4715, 2281, 50276, 249, 253, 806, 3408, 305, 69, 3637, 8244, 281, 253, 305, 69, 2685, 285, 352, 4720, 26414, 281, 247, 2919, 835, 253, 9479, 1255, 310, 11467, 374, 1464, 50276, 7461, 352, 811, 953, 715, 253, 1273, 3408, 1309, 534, 253, 9479, 1255, 8511, 735, 987, 387, 390, 1840, 374, 1464, 275, 253, 1273, 3408, 305, 69, 2550, 2572, 253, 9479, 1255, 10542, 1955, 281, 253, 18525, 7882, 7658, 3021, 253, 4477, 1416, 352, 253, 5024, 273, 7882, 3408, 50276, 5371, 310, 4722, 310, 326, 275, 253, 5024, 273, 7882, 3408, 253, 2957, 310, 1335, 11052, 25060, 3738, 417, 41907, 1037, 50275, 856, 84, 50276, 74, 4264, 4361, 436, 19529, 352, 310, 4518, 3542, 285, 253, 10704, 7103, 310, 671, 4209, 50276, 936, 619, 1682, 273, 3640, 253, 8310, 326, 253, 5024, 273, 7882, 6569, 1309, 253, 2644, 3563, 3408, 273, 305, 69, 8062, 310, 747, 50276, 262, 12957, 247, 1077, 9542, 18525, 3879, 273, 305, 69, 323, 3733, 11454, 6928, 534, 556, 417, 644, 24181, 6949, 1078, 50275, 40622, 891, 1158, 436, 19529, 1160, 247, 1077, 1774, 285, 3236, 7680, 281, 253, 4685, 273, 305, 69, 8062, 50276, 249, 3676, 4715, 50275, 5040, 50276, 783, 2954, 342, 253, 2045, 1263, 327, 253, 18525, 7882, 273, 256, 35333, 310, 417, 4209, 5469, 275, 619, 4743, 816, 3981, 2045, 2987, 452, 9125, 326, 253, 7882, 3607, 273, 13757, 11333, 812, 7826, 5752, 347, 247, 830, 273, 15424, 8492, 275, 3676, 4715, 310, 9090, 417, 10799, 285, 2217, 247, 1781, 1180, 273, 10704, 1543, 275, 1249, 2168, 2692, 326, 253, 5024, 273, 7882, 6569, 323, 253, 41886, 5482, 534, 8018, 326, 253, 5024, 273, 7882, 1364, 5108, 387, 1878, 275, 253, 1077, 3563, 3408, 273, 305, 69, 8062, 50275, 783, 747, 9010, 273, 436, 19529, 403, 326, 253, 5024, 273, 7882, 2686, 6556, 323, 247, 1781, 5110, 273, 305, 69, 8062, 534, 310, 1077, 12439, 50275, 783, 4477, 943, 11120, 3748, 326, 253, 5024, 273, 7882, 369, 2168, 2540, 275, 841, 2045, 2987, 4933, 253, 987, 6152, 281, 253, 987, 10414, 1057, 417, 5237, 253, 7680, 273, 436, 19529, 50276, 20432, 253, 480, 1662, 251, 5024, 273, 7882, 369, 806, 908, 275, 337, 285, 253, 4477, 1014, 858, 417, 3748, 352, 50274, 18, 305, 300, 11282, 295, 400, 1162, 355, 387, 7882, 84, 5024, 849, 281, 4575, 4373, 22041, 281, 14003, 46836, 5438, 275, 35576, 3733, 273, 11454, 6928, 549, 32693, 638, 3845, 549, 32693, 746, 2693, 10683, 1449, 6247, 50276, 19, 259, 86, 43278, 448, 8500, 6429, 285, 299, 359, 249, 266, 849, 256, 35333, 34899, 253, 4156, 46836, 275, 689, 19484, 1025, 4715, 247, 18525, 7882, 8668, 16424, 275, 11454, 1491, 5162, 2718, 4765, 187, 187, 4118, 18435, 27, 783, 2929, 14371, 326, 11786, 1398, 592, 3839, 17209, 275, 247, 9459, 835, 253, 9879, 5222, 273, 253, 344, 859, 757, 310, 347, 1781, 347, 1896, 1677, 253, 4715, 2281, 50275, 783, 2929, 10262, 247, 1077, 11080, 16774, 20028, 273, 253, 4275, 1750, 534, 369, 14109, 407, 253, 30628, 50276, 66, 4275, 2523, 281, 479, 275, 18738, 253, 789, 369, 697, 38135, 2720, 789, 556, 2011, 1077, 8244, 2905, 2538, 323, 256, 35333, 253, 30628, 14109, 275, 11985, 253, 38135, 273, 253, 10799, 1750, 670, 253, 9879, 5222, 46115, 387, 1475, 1315, 317, 19, 1464, 391, 21, 285, 391, 19, 671, 5439, 253, 2523, 326, 253, 2905, 
789, 5955, 310, 417, 4209, 4496, 1056, 2119, 326, 368, 2319, 1077, 9257, 2905, 789, 275, 253, 2929, 1690, 247, 625, 7000, 5955, 275, 253, 10199, 50276, 783, 767, 2234, 3374, 5439, 407, 391, 20, 665, 14285, 323, 18235, 497, 326, 337, 253, 789, 2175, 11786, 18499, 2581, 685, 256, 35333, 285, 374, 3480, 273, 3762, 891, 5194, 342, 841, 7350, 4931, 253, 4477, 943, 2953, 337, 407, 19936, 625, 9257, 2720, 789, 326, 2722, 326, 247, 2074, 11562, 1057, 1646, 281, 5108, 275, 3733, 342, 256, 35333, 347, 323, 374, 891, 5194, 1060, 342, 391, 18, 83, 19, 285, 391, 21, 326, 16774, 7103, 310, 247, 2234, 4757, 273, 253, 2929, 50275, 3169, 327, 253, 1840, 352, 310, 619, 11284, 281, 5583, 253, 14924, 273, 253, 2929, 5717, 368, 323, 29315, 634, 789, 281, 17857, 32888, 285, 4496, 1056, 2119, 368, 2953, 512, 16157, 273, 253, 30628, 275, 253, 4049, 254, 609, 5102, 2715 ]
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: this paper focuses on the floating point unsoundness of neural network verification procedures and shows that the results from such verifiers cannot be trusted to drive home the message of the paper the authors take mipverify which doesnt ensure fp soundness and show that they are able to construct adversarial examples for the cases that are returned as verified by mipverify it is an interesting nice paper but the contribution is weakened by two facts one it is already known that mipverify doesnt ensure fp soundness which is also acknowledged by its authors second when fp soundness is not ensured and given the fact that adversarial examples are widely present it is no surprise that one could find adversarial examples so i am split on the paper from a formal methods perspective the discovery of the paper is not surprising as floating point computations are known to be important this is one of the reasons that smt solvers put a lot of emphasis on fp but perhaps from an ml practitioners perspective it may be interesting docsepsummary the authors develop a method to generate pairs of samples that are separated by a small adversarial perturbation and have different classes but with the specificity that a complete verifier would return a result indicating that this sample admits no adversarial perturbation despite the fact that it does as evidenced by the second element of the pair these samples are obtained by considering a brightness perturbation of the image and finding the parameter alpha at which the verifier switches from returning safe to unsafe the resulting perturbed image is going to have adversarial examples very close to the boundary of the region considered so small floating point errors might result in returning incorrect results main thoughts the problem that the authors discuss is very well highlighted and explained it is clear what vulnerability they identified as well as the mechanism that they use to highlight it on the other hand in terms of importance i would rank it more as an interesting observation than an actual critical problem if we assume that what i care about is robustness of my image classification system for perturbations of size epsilon 0.1 then it seems that the worst that can happen is that some samples that i verified to be robust for epsilon 0.1 are in practice only robust for epsilon 0.09999 this doesnt seem overtly critical and would result in essentially the same result in any application questions the choice of what solver to use as a backend for a mip formulation of the neural network verification problem is an implementation detail mipverify could well be implemented with a different solver mip solvers returning incorrect results due to floating point errors is not a new problem and there seems to be some literature on how to address these problems if they are considered of importance safe bounds in linear and mixedinteger linear programming neumaier shcherbina in addition could this problem be solved by simply adjusting the tolerance parameters of the solver i did not see any discussion of this by the authors but i imagine that the default parameters used by the verifier might be geared more towards speed than towards perfect accuracy the authors mention verifiers that incorporate proper handling of floating point errors eran but then reject it by saying that
it relies on specific implementation details of the inference algorithm this seems strange because thats exactly the recommendation that the authors make page 2 any sound verifier for this class of networks must reason about the specific floating point error characteristics of the neural network implementation at hand minor questions in figure 1a it seems like for the first 4 graphs the dotted lines which i assume imply what the difference should be are lines with slope 1 why would the change in the logit vector vary at the same rate as the perturbations shouldnt there be a slope dependent on the corresponding gradient coefficientdocsepthe paper presents a method to find adversarial inputs for neural networks in regions where the networks can be proven not to admit any such adversarial examples practically demonstrating the unsoundness of a complete verifier as well as an incomplete verifier while it was already obvious to me that verifiers that assume floatingpoint arithmetic is the same as real arithmetic are unsound the paper is a service to the community in that it also makes this very obvious to informed outsiders who may not have already questioned the validity of robustness verification research that does not model roundoff and even ignores it in its own implementation the related work section does a good job of surveying the state of the art as it relates to floatingpoint soundness the authors also took some space to discuss how their findings relate to current and future research on robustness verification which i think is important in this case perhaps there could be a short discussion of challenges that different approaches face to become sound with respect to floatingpoint semantics for example it seems particularly challenging for approaches based on duality as the correctness of certificates depends nontrivially on closedform solutions to optimization problems as well as associativity of addition the technical sections are mostly wellwritten though i was not able to figure out some details for example it is not so clear how precisely binary search is used to find  and  simultaneously section 4.2 is a bit dense and its presentation could probably be improved inevitable presence of numerical error in the verifier it is not inevitable that the verifier is subject to error we could encode the precise floatingpoint semantics of the neural network as a sat formula and then watch the sat solver time out but this does give a sound and complete method docsepin the recent literature there has been a rise in the number of papers which attempt to verify neural networks the specification of the verification problems often gets adapted according to the application in mind more specifically for image classification networks the problem is to prove that the output of the neural network does not flip for small perturbations to the pixel values for a robotic setting the problem is often safety and convergence to some goal state where the neural network operates in closed loop with the system dynamics the authors in this paper present an adversarial attack model on neural networks which are deemed correct by some verifier more specifically given a neural network which can be shown to be robust to adversarial perturbations around some input the authors exploit numerical errors in the computations to attack the network demonstrating the presence of loopholes in the proving engines themselves this is due to the approximation errors introduced by using floating point numbers in my opinion the notion of input sets in the space of images is not a very useful one mainly because the interval valued sets
representing perturbations of the input image are far removed from the intended specification it would be a step in the right direction if the verification of computer vision tasks were a well defined problem since its not clear what to verify in the first place the use case of this paper is not a very convincing one in my opinion the problem of verifying neural networks in a robotic setting has a more meaningful specification hence i dont think that this paper in itself will be interesting to the general theme of the conference ### Summary:
there are many recent methods for the formal verification of neural networks however most of these methods do not soundly model the floatingpoint representation of real numbers this paper shows that this unsoundness can be exploited to construct adversarial examples for supposedly verified networks the takeaway is that future approaches to neural network verification should take into account floatingpoint semantics this was a borderline paper on one hand to anyone wellversed in formal methods it is not surprising that unsound verification leaves the door open for exploits also there is prior work singh et al neurips 2018 on verification of neural networks that explicitly aims for soundness wrt floatingpoint arithmetic on the other hand it is true that many adversarial learning researchers do not appreciate the value of this kind of soundness in the end the decision came down to the significance of the result here i have to side with reviewer 1 the impact of this problem is limited in the first place and also the issue of floatingpoint soundness has come up in prior work on neural network verification for these reasons the paper cannot be accepted this time around
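As an illustration of the boundary-search procedure the second reviewer describes (binary-searching the brightness shift alpha at which the verifier flips from safe to unsafe), here is a minimal sketch; the verifier interface, clipping range, and tolerance are assumptions made for illustration and are not taken from the paper under review.

```python
import numpy as np

def find_boundary_alpha(x, verifier, eps, lo=0.0, hi=1.0, tol=1e-7):
    """Binary-search the brightness shift alpha at which a (hypothetical)
    verifier flips from proving eps-robustness to failing to prove it."""
    def certified(alpha):
        # verifier(image, eps) is assumed to return True when it claims a proof
        return verifier(np.clip(x + alpha, 0.0, 1.0), eps)

    assert certified(lo) and not certified(hi), "bracket must straddle the flip"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if certified(mid):
            lo = mid
        else:
            hi = mid
    # x + lo is still certified, yet true adversarial examples sit so close to
    # the eps-ball boundary that floating point round-off in the verifier can
    # make the certificate wrong, which is the vulnerability the review describes
    return lo, hi
```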
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 16633, 327, 253, 14974, 1127, 5061, 517, 1255, 273, 11454, 2990, 21999, 7259, 285, 2722, 326, 253, 1543, 432, 824, 2336, 13783, 476, 417, 320, 18273, 281, 4446, 1728, 253, 3935, 273, 253, 2929, 253, 4477, 1379, 278, 532, 36302, 534, 36908, 5416, 44296, 3590, 1255, 285, 2722, 326, 597, 403, 2104, 281, 3989, 48960, 6667, 323, 253, 2219, 326, 403, 4895, 347, 16058, 407, 278, 532, 36302, 50275, 262, 310, 271, 4722, 5322, 2929, 533, 253, 7680, 310, 33153, 407, 767, 50276, 33799, 581, 352, 310, 2168, 1929, 326, 278, 532, 36302, 36908, 5416, 44296, 3590, 1255, 534, 310, 671, 14969, 407, 697, 4477, 1273, 672, 44296, 3590, 1255, 310, 417, 33075, 285, 1677, 253, 958, 326, 48960, 6667, 403, 7561, 1246, 352, 310, 642, 9326, 326, 581, 812, 1089, 48960, 6667, 50275, 601, 891, 717, 8085, 327, 253, 2929, 432, 247, 7473, 3082, 8668, 253, 8900, 273, 253, 2929, 310, 417, 10084, 347, 14974, 1127, 30745, 403, 1929, 281, 320, 1774, 436, 310, 581, 273, 253, 4606, 326, 924, 85, 1220, 735, 1691, 247, 2257, 273, 15075, 327, 44296, 533, 4931, 432, 13361, 34815, 352, 778, 320, 4722, 5474, 339, 793, 360, 3454, 253, 4477, 1287, 247, 1332, 281, 6635, 8557, 273, 3410, 326, 403, 9070, 407, 247, 1355, 48960, 20452, 326, 452, 1027, 966, 533, 342, 253, 13005, 326, 253, 247, 3426, 2336, 5425, 651, 6548, 247, 906, 7809, 326, 436, 3410, 19943, 642, 48960, 20452, 5747, 253, 958, 326, 352, 1057, 347, 27007, 407, 253, 1273, 3284, 273, 253, 4667, 50276, 20513, 3530, 403, 2797, 407, 7296, 247, 20468, 20452, 273, 253, 2460, 285, 4560, 253, 4764, 9765, 387, 534, 253, 2336, 5425, 5234, 432, 10884, 4999, 281, 20372, 253, 4795, 44711, 2460, 310, 1469, 281, 452, 48960, 6667, 1077, 2810, 281, 253, 7548, 273, 253, 2919, 2783, 594, 1355, 14974, 1127, 6332, 1537, 906, 275, 10884, 13583, 1543, 50276, 7265, 7906, 253, 1895, 326, 253, 2488, 2319, 310, 1077, 973, 16318, 285, 5544, 352, 310, 2590, 752, 24189, 597, 3636, 347, 973, 347, 253, 5122, 326, 597, 897, 281, 6780, 352, 50276, 251, 253, 643, 1133, 275, 2426, 273, 6349, 891, 651, 5958, 352, 625, 347, 271, 4722, 8310, 326, 271, 4588, 4619, 1895, 604, 359, 5467, 326, 752, 516, 23374, 310, 31640, 273, 619, 2460, 9162, 985, 323, 20452, 273, 1979, 299, 4277, 520, 840, 352, 3133, 326, 253, 9065, 326, 476, 5108, 310, 326, 690, 3530, 326, 891, 16058, 281, 320, 10237, 323, 299, 4277, 520, 403, 275, 3946, 760, 10237, 323, 299, 4277, 361, 14432, 436, 36908, 1646, 19486, 314, 4619, 285, 651, 906, 275, 9093, 253, 1072, 906, 275, 667, 2898, 50276, 34974, 50276, 783, 4327, 273, 752, 47037, 281, 897, 347, 247, 31446, 323, 247, 278, 532, 15895, 273, 253, 11454, 2990, 21999, 1895, 310, 271, 7092, 2508, 278, 532, 36302, 812, 973, 320, 9009, 342, 247, 1027, 47037, 278, 532, 1220, 735, 10884, 13583, 906, 1955, 281, 14974, 1127, 6332, 310, 417, 247, 747, 1895, 285, 627, 3133, 281, 320, 690, 6239, 275, 849, 281, 519, 560, 841, 3237, 604, 597, 403, 2783, 273, 6349, 4999, 14493, 275, 4872, 285, 6804, 18743, 4872, 10717, 425, 9307, 1321, 50276, 1200, 5316, 67, 1758, 275, 1635, 812, 436, 1895, 320, 14042, 407, 3365, 19427, 253, 13761, 3602, 273, 253, 47037, 891, 858, 417, 923, 667, 5955, 273, 436, 407, 253, 4477, 533, 891, 8564, 326, 253, 4284, 3602, 908, 407, 253, 2336, 5425, 1537, 320, 48526, 625, 4404, 3885, 685, 4404, 3962, 7200, 50276, 783, 4477, 3748, 2336, 13783, 326, 19071, 1463, 10885, 273, 14974, 1127, 6332, 2827, 266, 533, 840, 12009, 352, 407, 3981, 326, 
352, 10725, 327, 2173, 7092, 4278, 273, 253, 17032, 5933, 436, 3133, 8921, 984, 28763, 4555, 253, 17401, 326, 253, 4477, 1056, 3239, 374, 667, 3590, 2336, 5425, 323, 436, 966, 273, 6928, 1364, 1921, 670, 253, 2173, 14974, 1127, 2228, 5319, 273, 253, 11454, 2990, 7092, 387, 1133, 50276, 37585, 3533, 275, 4677, 337, 66, 352, 3133, 751, 323, 253, 806, 577, 14580, 253, 24817, 3104, 534, 891, 5467, 8018, 752, 253, 3064, 943, 320, 403, 3104, 342, 14679, 337, 2139, 651, 253, 1818, 275, 253, 2412, 262, 4972, 6889, 387, 253, 1072, 2281, 347, 253, 26309, 943, 2649, 627, 320, 247, 14679, 7976, 327, 253, 3969, 11786, 10235, 7152, 339, 431, 248, 2929, 10262, 247, 1332, 281, 1089, 48960, 14800, 323, 11454, 6928, 275, 4811, 835, 253, 6928, 476, 320, 11464, 417, 281, 11476, 667, 824, 48960, 6667, 18236, 17227, 253, 5061, 517, 1255, 273, 247, 3426, 2336, 5425, 347, 973, 347, 271, 18464, 2336, 5425, 1223, 352, 369, 2168, 4755, 281, 479, 326, 2336, 13783, 326, 5467, 14974, 3659, 27844, 310, 253, 1072, 347, 1524, 27844, 403, 5061, 517, 253, 2929, 310, 247, 2579, 281, 253, 3114, 275, 326, 352, 671, 2789, 436, 1077, 4755, 281, 8191, 20823, 5852, 665, 778, 417, 452, 2168, 17801, 253, 13091, 273, 31640, 21999, 2561, 326, 1057, 417, 1566, 3790, 2727, 285, 1014, 35136, 352, 275, 697, 1211, 7092, 253, 2905, 789, 2593, 1057, 247, 1175, 2628, 273, 8957, 3184, 253, 1375, 273, 253, 1445, 347, 352, 7033, 281, 14974, 3659, 3590, 1255, 253, 4477, 671, 2335, 690, 2317, 281, 2319, 849, 616, 4342, 14588, 281, 1655, 285, 2852, 2561, 327, 31640, 21999, 534, 891, 1158, 310, 1774, 275, 436, 1083, 50276, 30875, 627, 812, 320, 247, 2159, 5955, 273, 7881, 326, 1027, 7274, 2454, 281, 2489, 3590, 342, 1675, 281, 14974, 3659, 35185, 323, 1650, 352, 3133, 3782, 11132, 323, 7274, 1754, 327, 34962, 347, 253, 36594, 273, 28460, 7024, 25450, 1069, 1365, 327, 4581, 630, 5482, 281, 13757, 3237, 347, 973, 347, 1709, 18473, 273, 1635, 50276, 783, 7681, 7118, 403, 6571, 973, 15720, 2167, 891, 369, 417, 2104, 281, 4677, 562, 690, 4278, 323, 1650, 352, 310, 417, 594, 2590, 849, 10534, 8985, 3186, 310, 908, 281, 1089, 50276, 395, 50276, 3549, 503, 8272, 2593, 5976, 310, 247, 2372, 14086, 285, 697, 9759, 812, 3164, 320, 5520, 50275, 460, 87, 7116, 3361, 273, 10704, 2228, 275, 50276, 783, 2336, 5425, 352, 310, 417, 19455, 326, 253, 2336, 5425, 310, 2256, 281, 2228, 359, 812, 22573, 253, 10799, 14974, 3659, 35185, 273, 253, 11454, 2990, 347, 247, 2206, 7212, 285, 840, 3698, 253, 2206, 47037, 673, 562, 533, 436, 1057, 1918, 247, 3590, 285, 3426, 1332, 5474, 339, 9852, 253, 3332, 6239, 627, 556, 644, 247, 6054, 275, 253, 1180, 273, 9380, 534, 3177, 281, 12654, 11454, 6928, 253, 17776, 273, 253, 21999, 3237, 2223, 4850, 12956, 2556, 281, 253, 2898, 275, 2564, 625, 5742, 323, 2460, 9162, 6928, 253, 1895, 310, 281, 5276, 326, 253, 3453, 273, 253, 11454, 2990, 1057, 417, 19153, 323, 1355, 26309, 281, 253, 12275, 2193, 323, 247, 35121, 4758, 253, 1895, 310, 2223, 5252, 285, 14940, 281, 690, 4736, 1375, 835, 253, 11454, 2990, 17209, 275, 4581, 6287, 342, 253, 985, 8062, 50275, 783, 4477, 275, 436, 2929, 1246, 271, 48960, 2983, 1566, 327, 11454, 6928, 534, 310, 14320, 3451, 407, 690, 2336, 5425, 625, 5742, 50276, 28821, 247, 11454, 2990, 534, 476, 320, 2011, 281, 320, 10237, 281, 48960, 26309, 1475, 690, 3280, 253, 4477, 22059, 10704, 6332, 275, 253, 30745, 281, 2983, 253, 2990, 17227, 253, 3361, 273, 6287, 11385, 275, 253, 18597, 14917, 3139, 436, 310, 1955, 281, 253, 11193, 6332, 5611, 407, 970, 14974, 1127, 3904, 50276, 249, 619, 4743, 253, 10732, 273, 
3280, 5239, 275, 253, 2317, 273, 3888, 310, 417, 247, 1077, 4217, 581, 7194, 984, 253, 7726, 21392, 5239, 9999, 50276, 44931, 569, 273, 253, 3280, 2460, 310, 2080, 5176, 432, 253, 6034, 17776, 50276, 953, 247, 3213, 275, 253, 987, 3884, 604, 253, 21999, 273, 4382, 8113, 4836, 369, 247, 973, 2931, 1895, 1580, 697, 417, 2590, 752, 281, 12654, 275, 253, 806, 1659, 253, 897, 1083, 273, 436, 2929, 50276, 261, 417, 247, 1077, 21414, 581, 275, 619, 4743, 253, 1895, 273, 49160, 11454, 6928, 275, 247, 35121, 4758, 556, 247, 625, 14282, 17776, 50276, 48521, 891, 13414, 1158, 326, 436, 2929, 275, 3139, 588, 320, 4722, 281, 253, 2087, 10014, 273, 253, 8059, 2490, 187, 4118, 18435, 27, 9088, 403, 1142, 3332, 3082, 323, 253, 7473, 21999, 273, 11454, 6928, 2299, 954, 273, 841, 3082, 513, 417, 3590, 314, 1566, 253, 14974, 3659, 6779, 273, 1524, 3904, 436, 2929, 2722, 326, 436, 5061, 517, 1255, 476, 320, 28734, 281, 3989, 48960, 6667, 323, 24628, 16058, 6928, 253, 1379, 12594, 310, 326, 2852, 7274, 281, 11454, 2990, 21999, 943, 1379, 715, 2395, 14974, 3659, 35185, 50276, 2520, 369, 247, 45210, 2929, 327, 253, 643, 1133, 281, 3780, 973, 43910, 275, 7473, 3082, 352, 310, 417, 10084, 326, 5061, 517, 21999, 6505, 253, 3369, 1527, 323, 40725, 671, 627, 310, 2720, 789, 1625, 73, 1162, 355, 5723, 2824, 4765, 327, 21999, 273, 11454, 6928, 326, 11120, 13698, 323, 3590, 1255, 8772, 14974, 3659, 27844, 327, 253, 643, 1133, 352, 310, 2032, 326, 1142, 48960, 4715, 8607, 513, 417, 11435, 253, 1318, 273, 436, 2238, 273, 3590, 1255, 275, 253, 990, 253, 3061, 2210, 1066, 281, 253, 8453, 273, 253, 906, 1060, 891, 452, 281, 1930, 342, 37317, 337, 253, 3486, 273, 436, 1895, 310, 3710, 275, 253, 806, 1659, 285, 671, 253, 2523, 273, 14974, 3659, 3590, 1255, 556, 1705, 598, 275, 2720, 789, 327, 11454, 2990, 21999, 323, 841, 4606, 253, 2929, 2550, 320, 7607, 436, 673, 1475 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 16633, 327, 253, 14974, 1127, 5061, 517, 1255, 273, 11454, 2990, 21999, 7259, 285, 2722, 326, 253, 1543, 432, 824, 2336, 13783, 476, 417, 320, 18273, 281, 4446, 1728, 253, 3935, 273, 253, 2929, 253, 4477, 1379, 278, 532, 36302, 534, 36908, 5416, 44296, 3590, 1255, 285, 2722, 326, 597, 403, 2104, 281, 3989, 48960, 6667, 323, 253, 2219, 326, 403, 4895, 347, 16058, 407, 278, 532, 36302, 50275, 262, 310, 271, 4722, 5322, 2929, 533, 253, 7680, 310, 33153, 407, 767, 50276, 33799, 581, 352, 310, 2168, 1929, 326, 278, 532, 36302, 36908, 5416, 44296, 3590, 1255, 534, 310, 671, 14969, 407, 697, 4477, 1273, 672, 44296, 3590, 1255, 310, 417, 33075, 285, 1677, 253, 958, 326, 48960, 6667, 403, 7561, 1246, 352, 310, 642, 9326, 326, 581, 812, 1089, 48960, 6667, 50275, 601, 891, 717, 8085, 327, 253, 2929, 432, 247, 7473, 3082, 8668, 253, 8900, 273, 253, 2929, 310, 417, 10084, 347, 14974, 1127, 30745, 403, 1929, 281, 320, 1774, 436, 310, 581, 273, 253, 4606, 326, 924, 85, 1220, 735, 1691, 247, 2257, 273, 15075, 327, 44296, 533, 4931, 432, 13361, 34815, 352, 778, 320, 4722, 5474, 339, 793, 360, 3454, 253, 4477, 1287, 247, 1332, 281, 6635, 8557, 273, 3410, 326, 403, 9070, 407, 247, 1355, 48960, 20452, 326, 452, 1027, 966, 533, 342, 253, 13005, 326, 253, 247, 3426, 2336, 5425, 651, 6548, 247, 906, 7809, 326, 436, 3410, 19943, 642, 48960, 20452, 5747, 253, 958, 326, 352, 1057, 347, 27007, 407, 253, 1273, 3284, 273, 253, 4667, 50276, 20513, 3530, 403, 2797, 407, 7296, 247, 20468, 20452, 273, 253, 2460, 285, 4560, 253, 4764, 9765, 387, 534, 253, 2336, 5425, 5234, 432, 10884, 4999, 281, 20372, 253, 4795, 44711, 2460, 310, 1469, 281, 452, 48960, 6667, 1077, 2810, 281, 253, 7548, 273, 253, 2919, 2783, 594, 1355, 14974, 1127, 6332, 1537, 906, 275, 10884, 13583, 1543, 50276, 7265, 7906, 253, 1895, 326, 253, 2488, 2319, 310, 1077, 973, 16318, 285, 5544, 352, 310, 2590, 752, 24189, 597, 3636, 347, 973, 347, 253, 5122, 326, 597, 897, 281, 6780, 352, 50276, 251, 253, 643, 1133, 275, 2426, 273, 6349, 891, 651, 5958, 352, 625, 347, 271, 4722, 8310, 326, 271, 4588, 4619, 1895, 604, 359, 5467, 326, 752, 516, 23374, 310, 31640, 273, 619, 2460, 9162, 985, 323, 20452, 273, 1979, 299, 4277, 520, 840, 352, 3133, 326, 253, 9065, 326, 476, 5108, 310, 326, 690, 3530, 326, 891, 16058, 281, 320, 10237, 323, 299, 4277, 520, 403, 275, 3946, 760, 10237, 323, 299, 4277, 361, 14432, 436, 36908, 1646, 19486, 314, 4619, 285, 651, 906, 275, 9093, 253, 1072, 906, 275, 667, 2898, 50276, 34974, 50276, 783, 4327, 273, 752, 47037, 281, 897, 347, 247, 31446, 323, 247, 278, 532, 15895, 273, 253, 11454, 2990, 21999, 1895, 310, 271, 7092, 2508, 278, 532, 36302, 812, 973, 320, 9009, 342, 247, 1027, 47037, 278, 532, 1220, 735, 10884, 13583, 906, 1955, 281, 14974, 1127, 6332, 310, 417, 247, 747, 1895, 285, 627, 3133, 281, 320, 690, 6239, 275, 849, 281, 519, 560, 841, 3237, 604, 597, 403, 2783, 273, 6349, 4999, 14493, 275, 4872, 285, 6804, 18743, 4872, 10717, 425, 9307, 1321, 50276, 1200, 5316, 67, 1758, 275, 1635, 812, 436, 1895, 320, 14042, 407, 3365, 19427, 253, 13761, 3602, 273, 253, 47037, 891, 858, 417, 923, 667, 5955, 273, 436, 407, 253, 4477, 533, 891, 8564, 326, 253, 4284, 3602, 908, 407, 253, 2336, 5425, 1537, 320, 48526, 625, 4404, 3885, 685, 4404, 3962, 7200, 50276, 783, 4477, 3748, 2336, 13783, 326, 19071, 1463, 10885, 273, 14974, 1127, 6332, 2827, 266, 533, 840, 12009, 352, 407, 3981, 326, 
352, 10725, 327, 2173, 7092, 4278, 273, 253, 17032, 5933, 436, 3133, 8921, 984, 28763, 4555, 253, 17401, 326, 253, 4477, 1056, 3239, 374, 667, 3590, 2336, 5425, 323, 436, 966, 273, 6928, 1364, 1921, 670, 253, 2173, 14974, 1127, 2228, 5319, 273, 253, 11454, 2990, 7092, 387, 1133, 50276, 37585, 3533, 275, 4677, 337, 66, 352, 3133, 751, 323, 253, 806, 577, 14580, 253, 24817, 3104, 534, 891, 5467, 8018, 752, 253, 3064, 943, 320, 403, 3104, 342, 14679, 337, 2139, 651, 253, 1818, 275, 253, 2412, 262, 4972, 6889, 387, 253, 1072, 2281, 347, 253, 26309, 943, 2649, 627, 320, 247, 14679, 7976, 327, 253, 3969, 11786, 10235, 7152, 339, 431, 248, 2929, 10262, 247, 1332, 281, 1089, 48960, 14800, 323, 11454, 6928, 275, 4811, 835, 253, 6928, 476, 320, 11464, 417, 281, 11476, 667, 824, 48960, 6667, 18236, 17227, 253, 5061, 517, 1255, 273, 247, 3426, 2336, 5425, 347, 973, 347, 271, 18464, 2336, 5425, 1223, 352, 369, 2168, 4755, 281, 479, 326, 2336, 13783, 326, 5467, 14974, 3659, 27844, 310, 253, 1072, 347, 1524, 27844, 403, 5061, 517, 253, 2929, 310, 247, 2579, 281, 253, 3114, 275, 326, 352, 671, 2789, 436, 1077, 4755, 281, 8191, 20823, 5852, 665, 778, 417, 452, 2168, 17801, 253, 13091, 273, 31640, 21999, 2561, 326, 1057, 417, 1566, 3790, 2727, 285, 1014, 35136, 352, 275, 697, 1211, 7092, 253, 2905, 789, 2593, 1057, 247, 1175, 2628, 273, 8957, 3184, 253, 1375, 273, 253, 1445, 347, 352, 7033, 281, 14974, 3659, 3590, 1255, 253, 4477, 671, 2335, 690, 2317, 281, 2319, 849, 616, 4342, 14588, 281, 1655, 285, 2852, 2561, 327, 31640, 21999, 534, 891, 1158, 310, 1774, 275, 436, 1083, 50276, 30875, 627, 812, 320, 247, 2159, 5955, 273, 7881, 326, 1027, 7274, 2454, 281, 2489, 3590, 342, 1675, 281, 14974, 3659, 35185, 323, 1650, 352, 3133, 3782, 11132, 323, 7274, 1754, 327, 34962, 347, 253, 36594, 273, 28460, 7024, 25450, 1069, 1365, 327, 4581, 630, 5482, 281, 13757, 3237, 347, 973, 347, 1709, 18473, 273, 1635, 50276, 783, 7681, 7118, 403, 6571, 973, 15720, 2167, 891, 369, 417, 2104, 281, 4677, 562, 690, 4278, 323, 1650, 352, 310, 417, 594, 2590, 849, 10534, 8985, 3186, 310, 908, 281, 1089, 50276, 395, 50276, 3549, 503, 8272, 2593, 5976, 310, 247, 2372, 14086, 285, 697, 9759, 812, 3164, 320, 5520, 50275, 460, 87, 7116, 3361, 273, 10704, 2228, 275, 50276, 783, 2336, 5425, 352, 310, 417, 19455, 326, 253, 2336, 5425, 310, 2256, 281, 2228, 359, 812, 22573, 253, 10799, 14974, 3659, 35185, 273, 253, 11454, 2990, 347, 247, 2206, 7212, 285, 840, 3698, 253, 2206, 47037, 673, 562, 533, 436, 1057, 1918, 247, 3590, 285, 3426, 1332, 5474, 339, 9852, 253, 3332, 6239, 627, 556, 644, 247, 6054, 275, 253, 1180, 273, 9380, 534, 3177, 281, 12654, 11454, 6928, 253, 17776, 273, 253, 21999, 3237, 2223, 4850, 12956, 2556, 281, 253, 2898, 275, 2564, 625, 5742, 323, 2460, 9162, 6928, 253, 1895, 310, 281, 5276, 326, 253, 3453, 273, 253, 11454, 2990, 1057, 417, 19153, 323, 1355, 26309, 281, 253, 12275, 2193, 323, 247, 35121, 4758, 253, 1895, 310, 2223, 5252, 285, 14940, 281, 690, 4736, 1375, 835, 253, 11454, 2990, 17209, 275, 4581, 6287, 342, 253, 985, 8062, 50275, 783, 4477, 275, 436, 2929, 1246, 271, 48960, 2983, 1566, 327, 11454, 6928, 534, 310, 14320, 3451, 407, 690, 2336, 5425, 625, 5742, 50276, 28821, 247, 11454, 2990, 534, 476, 320, 2011, 281, 320, 10237, 281, 48960, 26309, 1475, 690, 3280, 253, 4477, 22059, 10704, 6332, 275, 253, 30745, 281, 2983, 253, 2990, 17227, 253, 3361, 273, 6287, 11385, 275, 253, 18597, 14917, 3139, 436, 310, 1955, 281, 253, 11193, 6332, 5611, 407, 970, 14974, 1127, 3904, 50276, 249, 619, 4743, 253, 10732, 273, 
3280, 5239, 275, 253, 2317, 273, 3888, 310, 417, 247, 1077, 4217, 581, 7194, 984, 253, 7726, 21392, 5239, 9999, 50276, 44931, 569, 273, 253, 3280, 2460, 310, 2080, 5176, 432, 253, 6034, 17776, 50276, 953, 247, 3213, 275, 253, 987, 3884, 604, 253, 21999, 273, 4382, 8113, 4836, 369, 247, 973, 2931, 1895, 1580, 697, 417, 2590, 752, 281, 12654, 275, 253, 806, 1659, 253, 897, 1083, 273, 436, 2929, 50276, 261, 417, 247, 1077, 21414, 581, 275, 619, 4743, 253, 1895, 273, 49160, 11454, 6928, 275, 247, 35121, 4758, 556, 247, 625, 14282, 17776, 50276, 48521, 891, 13414, 1158, 326, 436, 2929, 275, 3139, 588, 320, 4722, 281, 253, 2087, 10014, 273, 253, 8059, 2490, 187, 4118, 18435, 27, 9088, 403, 1142, 3332, 3082, 323, 253, 7473, 21999, 273, 11454, 6928, 2299, 954, 273, 841, 3082, 513, 417, 3590, 314, 1566, 253, 14974, 3659, 6779, 273, 1524, 3904, 436, 2929, 2722, 326, 436, 5061, 517, 1255, 476, 320, 28734, 281, 3989, 48960, 6667, 323, 24628, 16058, 6928, 253, 1379, 12594, 310, 326, 2852, 7274, 281, 11454, 2990, 21999, 943, 1379, 715, 2395, 14974, 3659, 35185, 50276, 2520, 369, 247, 45210, 2929, 327, 253, 643, 1133, 281, 3780, 973, 43910, 275, 7473, 3082, 352, 310, 417, 10084, 326, 5061, 517, 21999, 6505, 253, 3369, 1527, 323, 40725, 671, 627, 310, 2720, 789, 1625, 73, 1162, 355, 5723, 2824, 4765, 327, 21999, 273, 11454, 6928, 326, 11120, 13698, 323, 3590, 1255, 8772, 14974, 3659, 27844, 327, 253, 643, 1133, 352, 310, 2032, 326, 1142, 48960, 4715, 8607, 513, 417, 11435, 253, 1318, 273, 436, 2238, 273, 3590, 1255, 275, 253, 990, 253, 3061, 2210, 1066, 281, 253, 8453, 273, 253, 906, 1060, 891, 452, 281, 1930, 342, 37317, 337, 253, 3486, 273, 436, 1895, 310, 3710, 275, 253, 806, 1659, 285, 671, 253, 2523, 273, 14974, 3659, 3590, 1255, 556, 1705, 598, 275, 2720, 789, 327, 11454, 2990, 21999, 323, 841, 4606, 253, 2929, 2550, 320, 7607, 436, 673, 1475 ]
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: summary this paper focuses on the field of offpolicy reinforcement learning specifically the authors propose a modelbased reinforcement learning method on top of actorcritic methods the proposed method trains a dynamics model and a reward function on the offpolicy data with supervised learning and then uses the trained model to generate synthetic future states and rewards during the actor update during policy update for a given state the method computes the sum of the q value estimate of the state and the q value expansion for a few steps using the learned dynamics model and reward function the authors implement the proposed method on top of sac and td3 and evaluate their performance on several mujoco and box2d environments the experiment results show that the proposed method outperforms the modelfree baseline in terms of sample efficiency comments the paper is well written and the idea proposed in this paper is really easy to understand the authors also include a wide suite of experiments to demonstrate the sample efficiency of the proposed method despite these advantages i cannot recommend acceptance of this paper due to the lack of novelty and absence of fair baseline comparison which i will elaborate on next first of all despite the title of the paper the proposed method is really a modelbased reinforcement learning method since the system network and reward network are just dynamics model and reward model the proposed objective for the policy eqn 2 is merely a sum of current q value estimate and q value expansion for a few steps using the learned model which has been proposed before in various papers such as 1 and 2 the only difference is that when computing the gradient with respect to the policy the authors leave out the gradient that passes through the learned model which results in a biased estimate of the policy gradient therefore im not convinced about the novelty of the proposed method moreover while proposing a modelbased method the authors do not include baseline comparisons with other modelbased rl methods it is widely known that on lowdimensional control tasks modelbased methods outperform modelfree methods 3 and therefore merely comparing to modelfree baselines is unfair it would be important to include comparisons to model based methods 3 due to the lack of novelty and fair comparison to existing modelbased methods i cannot recommend acceptance for this paper references 1 heess nicolas et al learning continuous control policies by stochastic value gradients advances in neural information processing systems 2015 2 clavera ignasi yao fu and pieter abbeel modelaugmented actorcritic backpropagating through paths international conference on learning representations 2019 3 langlois eric et al benchmarking modelbased reinforcement learning arxiv preprint arxiv:1907.02057 2019 docsepsummary the authors proposed a simple modification to popular off policy algorithms such as sac and td3 by employing a model network and reward network the authors can expand the bellman update operation using the predicted next few steps similar to gae generalized advantage estimation the authors demonstrated that algorithms such as td3 ddpg and sac can all benefit from their approach pros the paper is clearly written and easy to understand the concept is simple and the implementation is straightforward the results indicated that the training sample efficiency and final policy performance of
the tested algorithms have improved for 6 benchmark tasks compared with their vanilla baselines cons while it is generally understandable that gaelike approaches can help balance between bias and variance in the qvalue estimation i am not convinced that the proposed networks are needed the authors used two additional networks the model network which they call the system network and the reward network the model network computes the standard transition (s_t, a_t) -> s_t+1 and the reward network estimates the true reward r = r(s_t, a_t, s_t+1) however as the reward function is generally provided i dont see why an additional reward network is necessary here second in offpolicy learning the state transition and next states are already known instead of using a model network to predict the next states one can simply sample a small trajectory containing multiple consecutive stateaction transitions and use them to do the qvalue estimation recommendations to address the concern above i propose that the authors add two more ablation studies 1 remove the reward network and only use the reward function 2 remove both the reward network and the model system network and use the recorded future states to estimate the qvalues docsepcorrectness issue the loss in equation 2 is the sum of the regular actor loss using the critic and an nstep return version of the policy value which would be reasonable for a policy update however the expression provided for the gradient and the calculations in the code are incorrect in that they are not the gradient of equation 2 with respect to the policy in particular it is not reflective of policy performance as it fails to account for the fact that the future states in the imagined rollout are also functions of the policy the resulting policy update in fork actually consists of 1 regular actor update using the critic at the current state 2 updating to greedily maximize the reward at each intermediate timestep which does not reflect the policy performance 3 another regular actor update from the last state in the rollout the straightforward way of correcting this to make it optimize the loss would be to simply differentiate through the model which may not work well as differentiating through learned models often leads to instability though it may be fine for such short rollouts alternatively a reinforce estimator can instead be used for the gradient of the nstep term for stochastic policies we note that the loss in eq 6 if we keep the dependence of the future state on the policy is also not reflective of policy returns since it overweights future returns over the present but can provide a reasonable interpretation of the fork update if we drop the dependence of the future state on the policy as the authors do the gradient of the forkq variant the authors take from eq 6 is actually the update we would obtain by using the regular actor update the nabla_theta pi_theta(s) q(s, pi(s)) term for deterministic policies but under a different state distribution instead of using the state distribution in the replay buffer the update takes into account the distribution obtained by running the policy for a few steps starting from the buffer distribution as running the policy from the buffer distribution would result in a distribution closer to the onpolicy state distribution one explanation for why the forkq update would improve is that the gradient is closer to the onpolicy policy gradients by changing the state distribution the same reasoning can be applied to the primary fork update presented as it includes an actor update on the value in
The issue is with the greedy reward maximization in the inner steps (the second term), but maybe it simply does not hurt on the environments tested, or perhaps it takes advantage of a bias-variance tradeoff in greedily maximizing immediate reward for a few intermediate timesteps. I would like the authors to explicitly address these issues in the paper and present a clear explanation of why their FORK update should be a better policy update.

Relation with model-based RL methods: Overall, I also disagree with the authors' claims that FORK is very different from, and much simpler than, other model-based algorithms. With regard to their comment on how their method is somehow simpler than rollout-based methods: their policy update is using a short Monte Carlo rollout for estimating the policy gradient, the difference being that the rollouts are used to update a policy rather than to explicitly plan at test time. The authors' claim that FORK does not require high-fidelity simulation seems unsupported to me, and the claim that model-based RL algorithms use the model in a sophisticated way is vague. In particular, Dyna-style algorithms like STEVE (https://arxiv.org/abs/1807.01675) and MBPO (https://arxiv.org/abs/1906.08253), which use the model to generate experience to help learn the critic, seem to be using the model in the same way as FORK, in the sense that they only use the model to generate samples with short rollouts. There is also no discussion of methods like ME-TRPO (https://arxiv.org/abs/1802.10592), SLBO (https://arxiv.org/abs/1807.03858), or the algorithms in https://arxiv.org/abs/2004.07804, all of which only use the model to generate trajectories to use with a policy gradient algorithm. Overall, I would appreciate much more discussion about how FORK relates to, and compares empirically against, past RL algorithms that only use the model to generate samples, as well as clarification of the statements about FORK using the model in a less sophisticated way. On a separate note, using the model to generate n-step-return estimates of the policy value has been done previously, for example in https://arxiv.org/abs/1807.01675. The key difference is that prior work used it to generate target values for learning the Q-function, while here it is used purely for policy updates. Given how similarly the models are used, however, I would recommend discussing this line of work explicitly in the related work, even though they are complementary.

Experimental evaluation: Despite the aforementioned correctness issue, the method seems to provide improvements when applied to TD3, and seemingly smaller improvements on top of SAC. However, I find it extremely strange that the authors chose, in Figure 4, to plot returns against the number of training steps instead of the number of samples. As acknowledged by the authors, this makes the SAC vs. TD3 comparisons incomparable, with TD3 and TD3-FORK enjoying the advantage of having seen twice as many samples. Moreover, comparing only on the number of actor-critic updates is not even a fair comparison between FORK and the baselines, as FORK additionally has to train a dynamics model and a reward predictor. I would highly recommend simply showing learning curves with respect to the actual number of environment samples rather than arbitrarily using the number of actor updates. I would also like to see comparisons against model-based RL baselines, particularly MBPO, which uses a Dyna-style update with SAC, to compare which method of utilizing the model is better. The authors could also see whether the FORK actor update further improves upon MBPO or other model-based RL methods.

Summary: As it stands, I believe the paper should be rejected, due to the correctness issues, the resulting lack of justification for why FORK should give better actor updates, and the insufficient discussion of how it relates to prior work in model-based RL. To consider accepting the paper, I would at least need to see these issues addressed by the authors. Regarding novelty and significance: using a model to predict n-step value estimates (as the paper claims to be doing), or using the model to explicitly adjust the state distribution of the policy update (as I suspect it might be doing instead), for the policy update in an off-policy actor-critic algorithm has not been done before, as far as I know. However, this change in how the model is used seems fairly minor, and to be convinced it is useful I would like to see evidence of how it compares against the other model-based RL algorithms. In particular, the benefit I imagine it might have over other model-based RL methods is in being more robust to poorly fit models, but I would need to see empirical evidence supporting this.

The paper proposes to combine ideas from model-based RL into model-free off-policy policy gradient algorithms like SAC and TD3. Specifically, the paper proposes to learn auxiliary models of environment rewards and dynamics and to use a two-step rollout from these models during the computation of the policy gradient. The paper presents results of this mechanism applied to standard SAC and TD3 implementations on a variety of continuous control environments, with favorable results.

Strengths: The story is generally easy to follow. I appreciate that the authors didn't just evaluate on the extremely common and somewhat saturated MuJoCo benchmarks and presented additional results on bipedal walker. As far as I can tell, the policy update proposed here is novel.

Weaknesses: While the policy update appears novel, it is very similar to techniques in the model-based RL literature. Given this, it would greatly improve the paper if it had appropriate comparisons to similar model-based techniques, especially those which claim to combine model-free updates with model-based techniques, for example https://arxiv.org/abs/1906.08253. Moreover, experimentally, it would be nice to see comparisons to state-of-the-art model-based RL methods, for example https://arxiv.org/abs/1802.10592. In terms of the current experimental results, I found the conclusions favorable to the proposed technique, but not a very compelling demonstration. For example, in Table 1, almost all the environments show only a slight benefit for the proposed method; it appears the only significant benefit is on Ant. Similarly, in Table 2, we see that the results of SAC-FORK are sometimes much worse than SAC on its own. In terms of motivation for the method, I was not entirely convinced of why the proposed update is needed. The paper appeals to the idea of needing to reason about values in the future, but shouldn't the Q-value already encapsulate this? Moreover, the proposed update ends up only optimizing actions in the future, rather than somehow reasoning about the values at steps t+1, t+2 to decide the best action at step t.

### Summary:
All reviewers agreed that the novelty of the method was not at the level expected for publication, and they also raised a number of technical concerns regarding the approach. There was no response from the authors on these issues. Hence the reviewer consensus is that the paper is not ready for publication at this time.
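For reference, the k-step model-based value expansion that the reviews above keep returning to can be written generically as follows (my notation, not the paper's). The correctness debate is about whether the gradient with respect to θ should keep the dependence of the imagined states on θ:

```latex
% Generic k-step value expansion with learned dynamics \hat{f}, learned reward
% \hat{r}, critic Q, and imagined states \hat{s}_{t+1} = \hat{f}(\hat{s}_t, \pi_\theta(\hat{s}_t)).
\hat{V}^{(k)}(s) \;=\; \sum_{t=0}^{k-1} \gamma^{t}\, \hat{r}\big(\hat{s}_t, \pi_\theta(\hat{s}_t)\big)
\;+\; \gamma^{k}\, Q\big(\hat{s}_k, \pi_\theta(\hat{s}_k)\big),
\qquad \hat{s}_0 = s .
```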
Below is a review of a research paper from a conference. Please write a summary of the review.

### Review:
This paper develops an approximate second-order policy optimization method to overcome the slow convergence of policy gradient methods. The paper proposes an entropy-regularized policy gradient method that approximates the Newton update using a diagonal preconditioner. Theoretical results are presented which show that the convergence rate of the new method near the optimal solution is quadratic. A generalized version of the algorithm is presented for any entropic regularizer. Experiments on two tabular MDPs demonstrate that the algorithm can reach convergence in a single-digit number of iterations.

The primary strength of this paper is the characterization of the Hessian around the optimal policy for a tabular policy representation. This analysis may be helpful to others who do theoretical analysis of policy gradient methods.

The biggest issue with this paper is that it does not offer clear support that this Hessian approximation is close when the policy is not trivially close to the optimal policy. There is no investigation of what effect this preconditioning matrix has on the optimization surface. Since the resulting update is simply a diagonalized version of the natural policy gradient method, one would expect further insight into when and why the method is advantageous. The experiments have two issues. The first is that the comparison between methods is unfair: the same step size is used for all methods, but they all have different dynamics, and it is not clear that the same hyperparameters for each method provide a meaningful comparison. The second is that the experiments shed no light on the type of problems this method will solve efficiently: the methods are only tested on two MDPs, and there is no discussion of what characteristics of the policy optimization problem these problems are designed to test. The paper also makes it seem like it is addressing policy optimization issues for typical RL settings, but the setting of this paper is not typical in practice. First, it does not acknowledge that these results do not pertain to cases with function approximation, which is the more common case of policy optimization. Second, it assumes full knowledge of the reward function and transition dynamics, which is rarely the case; in this setting it is not clear that policy gradient methods offer any advantage over value iteration or policy iteration techniques.

Questions: Is the following a correct interpretation of the approximate Hessian? The approximate Hessian assumes that the dominant terms of the Hessian only depend on the diagonal of the Fisher information matrix (FIM) of the policy parameters. This interpretation implies that off-diagonal components of the FIM, and any terms that are influenced by the reward function, have little impact on the curvature of the objective function. How close does the policy need to be to optimal before this Hessian approximation is reasonable? Does the approximate Hessian provide a helpful update direction and step length when the policy is not nearly optimal? Why would one use this Hessian approximation instead of the diagonal of the Gauss-Newton method? Why are the MDPs in the experiments well suited to demonstrate the essential properties of these policy gradient methods? What are the properties of the MDPs in the experiments that make these policy optimization methods converge quickly?
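To pin down the object these questions concern: the update being analysed, as I read it, replaces the full Fisher preconditioner of natural policy gradient with its diagonal. Schematically (my notation, not the paper's):

```latex
% Natural policy gradient vs. the diagonal approximation discussed above,
% where F(\theta) is the Fisher information matrix of the policy parameters.
\theta_{k+1} = \theta_k + \eta\, F(\theta_k)^{-1}\, \nabla_\theta J(\theta_k)
\qquad \text{vs.} \qquad
\theta_{k+1} = \theta_k + \eta\, \operatorname{diag}\!\big(F(\theta_k)\big)^{-1}\, \nabla_\theta J(\theta_k)
```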
Corrections: Section 1.1, first contribution — the algorithm does not recover the natural gradient algorithm, but uses the diagonal of the Fisher information matrix. Section 2 — the definition of w_pi / z_pi should have a 1 in the exponent.

Suggestions for improving clarity: Page 1, arbitrary weight vector e — it might be worth mentioning that in most RL problems this weight vector only covers a small set of possible states, making optimization difficult. Section 2 — r_epsilon and z_epsilon are introduced, but it is unclear what role they have or why they are being introduced. Section 2 — the formulation of the policy gradient is not immediately obvious, and no derivation or reference is given for the expression. Section 2 — the measure for the difference between pi* and pi is not given until much later in the paper; it was not clear from the writing whether this was meant to be a norm or a divergence measure.

I do not recommend this paper for acceptance, because there are many essential unanswered questions regarding the proposed method.

This paper proposes a quasi-Newton method for the policy gradient algorithm with entropy regularization, which is popular in solving the reinforcement learning problem with various entropy functions. The paper establishes a quadratic convergence rate for the proposed algorithm, and this convergence rate is verified using numerical experiments. Overall, I enjoyed reading this paper. The presentation is clear, but the writing can be improved, as explained in the minor comments.

Major comments:
1. Since using entropy regularization changes the problem, pi* of the modified problem is no longer the true optimal policy of the original problem. To complete the story, it is better to provide a bound on the difference between pi* of the modified problem and the true optimal policy. As a follow-up question: I believe there is a tradeoff between the asymptotic error and the convergence rate in choosing the regularization coefficient. Is it possible to make the regularization coefficient time-varying, so that we have asymptotic convergence to the true optimal policy and also an improved convergence rate?
2. In the statement of Theorem 5, the authors assume that theta_k converges to theta*. Since theta_k is a function of the iterate pi_k generated by the algorithm, this assumption seems problematic. Is it possible to actually prove the statement instead of assuming it? What is the major difficulty there?
3. Regarding Eq. 18 of Theorem 5: does it hold for all k, or only for k large enough? If it is the latter case, what parameters of the problem does the threshold depend on? Regarding the constant C: since it is not a numerical constant, what does it depend on?
4. Newton's method has been studied extensively in the literature. What is the main technical challenge in extending existing results on using Newton's method for optimization to the setting of the policy gradient algorithm?
5. The policy gradient method is introduced mostly for solving the reinforcement learning (RL) problem, where the MDP model is unknown. Existing algorithms such as PG and NPG can be easily generalized to the RL setting. For the proposed algorithm in this paper, due to the presence of P_a, it is not immediately clear how to actually use it in the RL setting, except to first use a model-based approach to estimate P_a. It would be nice to have a paragraph discussing the possibility of extending the results in this paper to the RL setting.

Other comments:
1. About writing: when stating a theorem, proposition, lemma, etc., it is better to introduce all the notation before the result; in that case the statement of the result is more concise.

After author feedback: since the authors did not provide any feedback, I would like to keep my score and vote for rejection. Overall, I think this paper is interesting and has the potential to contribute theoretically to policy gradient methods. There is some additional work (see my main review) that needs to be done to make this paper more competitive.
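For reference, the regularized problem that major comment 1 refers to has the generic form below; the paper's exact regularizer and notation may differ, so this is only an illustration of why the regularized optimum is biased away from the unregularized one:

```latex
% Entropy-regularized objective with temperature \tau > 0. Its maximizer
% \pi^*_\tau generally differs from the unregularized optimal policy \pi^*,
% which is the gap that comment 1 asks the authors to bound.
\max_{\pi}\;\; \mathbb{E}_{\pi}\!\left[\sum_{t=0}^{\infty} \gamma^{t}\,
  \Big( r(s_t, a_t) + \tau\, \mathcal{H}\big(\pi(\cdot \mid s_t)\big) \Big)\right]
```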
In this paper the authors propose a quasi-Newton method for policy gradient algorithms in reinforcement learning. While being entropy-regularised, their method acts as an umbrella for other techniques, including natural policy gradients. Interestingly, upon using different regularisers the approach yields new policy gradient algorithms. Although I think this is an interesting paper, I have some questions that I would be grateful to have answered:
1. I found the exposition of the paper to be a bit confusing and not very clear. Is it possible for the authors to clearly list, in a subsection (even in the appendix), the assumptions used when constructing the proofs?
2. Can the authors kindly illustrate the novelty conveyed in the proof? In other words, how is this analysis novel compared to the standard analysis of quasi-Newton methods, beyond its application to entropy-regularised reinforcement learning? That is not to say that the analysis is not rigorous; I just want to understand whether there were any hurdles that needed to be overcome in carrying optimisation proofs over to RL settings.
3. Concerning the assumptions: is this analysis assuming convexity? Of course, in discrete cases this can be met, but it is not general. If so, I think there needs to be a section clearly elaborating the limitations of the current proof.
4. Could the authors report the running time of their algorithm rather than just demonstrating iterations? It would be great if those running times were also compared to standard policy gradients and natural policy gradients.
5. Could the authors please show reward learning curves rather than just policy changes or log errors?
6. What does an η = 1 really mean? I do understand that setting a learning rate that high can lead us to the statement of the proof, but what does it mean empirically?

Please see above.
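Several of the comments above, and the review that follows, turn on the local quadratic convergence claim of Theorem 5. In generic quasi-Newton terms, the kind of guarantee at stake is (my paraphrase, not the paper's exact Eq. 18):

```latex
% Local quadratic convergence near \theta^*: once \theta_k enters a neighborhood
% of \theta^*, the error is squared at every step, for some constant C > 0.
\|\theta_{k+1} - \theta^{*}\| \;\le\; C\,\|\theta_k - \theta^{*}\|^{2},
\qquad \text{valid only for } \|\theta_k - \theta^{*}\| \text{ sufficiently small.}
```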
The paper is concerned with the infinite-horizon discounted, finite state and finite action space MDP problem with known reward and transition matrices. To solve this problem it proposes a quasi-Newton method for the policy gradient algorithm with entropy regularization. The main contribution is that it proves the proposed algorithm has a local quadratic convergence property under certain assumptions. It also uses two examples to show that the proposed algorithm is faster than the current policy-gradient-based algorithms in the literature.

As said in the summary section, this paper proposes a quasi-Newton method that uses the diagonal of the Hessian matrix as an approximation to accelerate the convergence of the policy gradient algorithm. It shows that, under certain conditions and with an initial policy close to the optimum, the quasi-Newton policy gradient algorithm converges quadratically. It is really interesting to see quadratic convergence for the quasi-Newton policy gradient method, but the conditions under which this convergence holds are very unclear. For example, in Theorem 5 the assumptions are that ∇f is Lipschitz on a closed subset and that θ_k → θ*. How could we verify that these conditions are met for specific problems? I think answering this question could help quantify the "local-ness" definition for this paper. For the first example, the initial policy is uniform; since, based on the theorem, quadratic convergence holds when the initial policy is very close to the optimal one, is this true for this example as well? Also, for the compared algorithms, do you use the same initial policy? How about the manually tuned learning rate? For GPMD, Cen et al. (2020) also has a quadratic convergence result for the close-to-optimal-policy case; how does this paper's result differ from it? The quadratic convergence of the proposed quasi-Newton algorithm is interesting, but the conditions under which it holds are pretty unclear. Since this is the paper's main contribution, it is a borderline submission for me at this moment; I will wait for the rebuttal to address my concerns.

### Summary:
This is a nice paper which shows that KL-regularized natural policy gradient — assuming exact access to the MDP, meaning no noise in the reward and Q-function estimates — which achieves linear convergence, can use ideas from quasi-Newton methods and recover their quadratic convergence. Given the excitement surrounding policy gradient methods and their convergence rates, this is a valuable direction and family of ideas. Unfortunately, the reviewers had many concerns about the presentation and also about the exact meaning and relationship of the results to prior work. I'll add to this and note that one issue with quasi-Newton methods is that it is unclear how long the burn-in phase is, meaning the phase before their quadratic convergence kicks in, and this is still an issue in the present work's theory. Another issue, as raised by reviewers, is the difference between the regularized and unregularized optimal policies. As such, it makes sense for this paper to receive more time and polish.
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 9371, 281, 2571, 665, 513, 10527, 1783, 327, 3646, 11786, 3082, 50276, 783, 5962, 2523, 310, 342, 436, 2929, 310, 326, 352, 1057, 417, 3959, 2590, 1329, 326, 436, 344, 859, 757, 11193, 310, 2810, 672, 253, 3646, 310, 417, 35820, 1365, 2810, 281, 253, 8654, 3646, 627, 310, 642, 5839, 347, 281, 752, 1055, 436, 638, 12380, 4315, 556, 327, 253, 13757, 2553, 1580, 253, 4795, 5731, 310, 3365, 247, 16421, 1025, 2715, 273, 253, 3626, 3646, 11786, 1332, 581, 651, 1902, 2007, 12288, 715, 672, 285, 2139, 253, 1332, 310, 24400, 50276, 20513, 4679, 452, 767, 3374, 253, 806, 310, 326, 253, 5301, 875, 3082, 310, 16593, 253, 1072, 3213, 1979, 310, 908, 323, 512, 3082, 533, 597, 512, 452, 1027, 8062, 285, 352, 310, 417, 2590, 326, 253, 1072, 4373, 22041, 323, 1016, 1332, 2085, 247, 14282, 5301, 253, 1273, 310, 326, 253, 4679, 17914, 642, 1708, 327, 253, 1511, 273, 3237, 436, 1332, 588, 8415, 14556, 253, 3082, 403, 760, 5762, 327, 767, 31934, 793, 285, 627, 310, 642, 5955, 347, 281, 752, 5319, 273, 253, 3646, 13757, 1895, 841, 3237, 403, 4158, 281, 1071, 50276, 783, 2929, 671, 2789, 352, 1646, 751, 352, 310, 15974, 3646, 13757, 3374, 323, 6867, 391, 77, 7533, 533, 253, 4758, 273, 436, 2929, 310, 417, 6867, 275, 3946, 806, 352, 1057, 417, 14409, 326, 841, 1543, 513, 417, 6925, 404, 281, 2219, 342, 1159, 11193, 534, 310, 253, 625, 1846, 1083, 273, 3646, 13757, 1273, 352, 19584, 2120, 3640, 273, 253, 10921, 1159, 285, 5502, 8062, 534, 310, 11766, 253, 1083, 275, 436, 4758, 352, 310, 417, 2590, 326, 3646, 11786, 3082, 3959, 667, 5750, 689, 1318, 19502, 3646, 19502, 5609, 50275, 34974, 310, 253, 1563, 247, 3451, 7914, 273, 253, 16851, 344, 859, 757, 253, 16851, 344, 859, 757, 19584, 326, 253, 11360, 2426, 273, 253, 344, 859, 757, 760, 3469, 327, 253, 16421, 273, 253, 27633, 1491, 4315, 269, 303, 273, 253, 3646, 3602, 436, 7914, 8018, 326, 745, 41758, 4295, 273, 253, 269, 303, 285, 667, 2426, 326, 403, 12208, 407, 253, 10921, 1159, 452, 1652, 3486, 327, 253, 16841, 273, 253, 8103, 1159, 50276, 5430, 2810, 1057, 253, 3646, 878, 281, 320, 8654, 1078, 436, 344, 859, 757, 11193, 310, 5272, 50276, 18566, 253, 16851, 344, 859, 757, 2085, 247, 9371, 5731, 3884, 285, 3213, 2978, 672, 253, 3646, 310, 417, 4829, 8654, 50276, 22309, 651, 581, 897, 436, 344, 859, 757, 11193, 3185, 273, 253, 16421, 273, 253, 305, 10064, 1826, 1299, 1332, 50276, 22309, 403, 253, 31934, 793, 275, 253, 4679, 973, 18960, 281, 7568, 253, 5667, 3607, 273, 841, 3646, 11786, 3082, 752, 403, 253, 3607, 273, 253, 31934, 793, 275, 253, 3368, 326, 1056, 841, 3646, 13757, 3082, 29623, 4541, 50274, 5528, 38526, 2593, 1903, 806, 7680, 253, 5933, 1057, 417, 9295, 253, 3626, 11786, 5933, 533, 4648, 253, 16421, 273, 253, 27633, 1491, 4315, 50276, 4674, 374, 5426, 273, 259, 2059, 50276, 91, 2059, 943, 452, 247, 337, 275, 253, 23653, 50275, 35640, 621, 323, 11138, 19843, 3239, 337, 10341, 2801, 4972, 299, 352, 1537, 320, 4409, 29570, 326, 275, 954, 391, 77, 3237, 436, 2801, 4972, 760, 10949, 247, 1355, 873, 273, 1896, 3054, 2403, 13757, 2834, 50276, 4674, 374, 294, 4277, 285, 1182, 4259, 403, 5611, 533, 352, 310, 12744, 752, 2554, 597, 452, 390, 2139, 597, 403, 1146, 5611, 50276, 4674, 374, 15895, 273, 253, 3646, 11786, 310, 417, 4745, 4755, 285, 642, 28529, 390, 3806, 310, 1677, 323, 253, 2048, 50276, 4674, 374, 2557, 323, 1121, 74, 50276, 2059, 310, 417, 1677, 1919, 1199, 1996, 275, 253, 2929, 352, 369, 417, 2590, 604, 436, 369, 5486, 281, 320, 247, 5222, 390, 23279, 2557, 432, 253, 4028, 50274, 74, 513, 417, 5583, 436, 2929, 323, 14924, 984, 627, 403, 1142, 
5667, 440, 42195, 3533, 5001, 253, 4081, 1332, 50276, 7152, 33032, 2520, 2929, 29328, 247, 21582, 460, 88, 1299, 1332, 323, 3646, 11786, 5933, 342, 15579, 37820, 534, 310, 4633, 275, 16161, 253, 35221, 4715, 1895, 342, 2710, 15579, 3470, 436, 2929, 25097, 21396, 14940, 2281, 323, 253, 4081, 5933, 824, 14940, 2281, 310, 16058, 970, 10704, 4679, 4583, 891, 11346, 4361, 436, 2929, 253, 9759, 310, 2590, 533, 253, 4028, 476, 320, 5520, 347, 5544, 275, 253, 5884, 5701, 50276, 24330, 5701, 50276, 18, 1580, 970, 15579, 37820, 2544, 253, 1895, 12580, 273, 253, 7321, 1895, 310, 642, 3356, 253, 2032, 8654, 3646, 273, 253, 3236, 1895, 281, 3426, 253, 2926, 352, 310, 1805, 281, 2085, 247, 3033, 327, 253, 3064, 875, 12580, 273, 253, 7321, 1895, 285, 253, 2032, 8654, 3646, 347, 247, 956, 484, 1953, 891, 2868, 627, 310, 247, 5454, 2727, 875, 253, 20185, 2228, 285, 253, 14940, 2281, 275, 13887, 253, 37820, 10235, 310, 352, 1896, 281, 1056, 253, 37820, 10235, 673, 39381, 272, 594, 326, 359, 452, 20185, 14940, 281, 253, 2032, 8654, 3646, 285, 671, 452, 5520, 14940, 2281, 50276, 19, 275, 253, 3908, 273, 10012, 608, 253, 4477, 5467, 326, 253, 85, 518, 26414, 281, 39116, 1580, 253, 85, 518, 310, 247, 1159, 273, 253, 35388, 268, 1479, 4561, 407, 253, 5933, 436, 9376, 3133, 281, 320, 20276, 310, 352, 1896, 281, 2686, 5276, 253, 3908, 3185, 273, 7384, 352, 752, 310, 253, 2201, 10183, 627, 50276, 20, 5001, 16186, 1283, 273, 10012, 608, 1057, 352, 2186, 323, 512, 465, 390, 760, 323, 465, 1781, 2217, 604, 352, 310, 253, 6158, 1083, 752, 3602, 273, 253, 1895, 1057, 253, 7887, 3469, 327, 5001, 253, 3638, 260, 1580, 352, 310, 417, 247, 10704, 3638, 752, 1057, 352, 3469, 327, 50276, 21, 747, 24787, 1332, 556, 644, 5421, 18171, 275, 253, 6239, 752, 310, 253, 2022, 7681, 5691, 275, 13633, 5368, 1543, 327, 970, 747, 24787, 1332, 281, 8415, 13757, 1895, 281, 253, 4758, 273, 3646, 11786, 5933, 50276, 22, 253, 3646, 11786, 1332, 310, 5611, 6571, 323, 16161, 253, 35221, 4715, 391, 77, 1895, 835, 253, 278, 12132, 1566, 310, 7202, 5368, 11333, 824, 347, 23256, 285, 295, 8159, 476, 320, 4354, 14923, 281, 253, 391, 77, 4758, 323, 253, 4081, 5933, 275, 436, 2929, 1955, 281, 253, 3361, 273, 1349, 352, 310, 417, 4745, 2590, 849, 281, 2686, 897, 352, 275, 253, 391, 77, 4758, 3707, 281, 806, 897, 253, 1566, 3169, 2746, 281, 6642, 1349, 352, 651, 320, 5322, 281, 452, 247, 12494, 16585, 670, 253, 6387, 273, 13633, 253, 1543, 275, 436, 2929, 281, 253, 391, 77, 4758, 50276, 977, 5701, 50276, 18, 670, 4028, 672, 14851, 247, 10012, 13989, 18057, 3966, 352, 310, 1805, 281, 9569, 512, 253, 14951, 1078, 253, 906, 275, 326, 1083, 253, 3908, 273, 253, 906, 310, 625, 44003, 50276, 6438, 2488, 8680, 50276, 17480, 253, 2488, 858, 417, 2085, 667, 8680, 891, 651, 751, 281, 1978, 619, 4868, 285, 6273, 323, 18235, 50276, 1189, 455, 891, 1158, 436, 2929, 310, 4722, 285, 556, 253, 2442, 281, 8162, 28055, 281, 3646, 11786, 1332, 627, 310, 690, 3081, 789, 923, 619, 2022, 2278, 326, 3198, 281, 320, 2218, 281, 1056, 436, 2929, 625, 12085, 5474, 339, 9852, 436, 2929, 253, 4477, 12661, 247, 21582, 460, 88, 1299, 1332, 323, 3646, 11786, 11333, 275, 35221, 4715, 1223, 1146, 15579, 3963, 1701, 616, 1332, 6993, 347, 271, 33265, 273, 643, 5609, 1690, 3626, 3646, 27935, 4722, 314, 2220, 970, 1027, 3963, 34768, 253, 2746, 11026, 747, 3646, 11786, 11333, 50276, 20261, 891, 1158, 436, 310, 271, 4722, 2929, 891, 452, 690, 3533, 326, 891, 651, 320, 14442, 604, 9577, 50275, 18, 891, 1119, 253, 47284, 273, 253, 2929, 281, 320, 247, 2372, 21643, 285, 417, 1077, 2590, 310, 
352, 1896, 323, 253, 4477, 281, 4518, 1618, 275, 247, 19087, 1014, 275, 253, 30762, 253, 13260, 908, 672, 26736, 253, 27947, 374, 476, 253, 4477, 26604, 17093, 253, 38135, 29403, 275, 253, 4737, 275, 643, 3000, 849, 310, 436, 1783, 4460, 2429, 281, 2629, 1783, 273, 21582, 460, 88, 1299, 3082, 4457, 697, 2898, 281, 15579, 3963, 1701, 35221, 4715, 326, 310, 417, 281, 1333, 326, 253, 1783, 310, 417, 26565, 891, 816, 971, 281, 2096, 604, 627, 497, 667, 7929, 49467, 326, 3058, 281, 320, 11399, 8785, 5556, 5837, 27947, 281, 391, 77, 7533, 50276, 20, 8664, 253, 13260, 310, 436, 1783, 7384, 17133, 414, 273, 2282, 275, 13358, 2219, 436, 476, 320, 1313, 533, 352, 310, 417, 2087, 604, 594, 891, 1158, 627, 3198, 281, 320, 247, 2593, 4518, 14883, 839, 253, 7364, 273, 253, 1655, 4737, 50276, 21, 812, 253, 4477, 1304, 253, 3515, 673, 273, 616, 5933, 2581, 685, 816, 17227, 25142, 352, 651, 320, 1270, 604, 1110, 3515, 2069, 403, 671, 2429, 281, 2629, 3646, 27935, 285, 3626, 3646, 27935, 50276, 22, 812, 253, 4477, 4496, 921, 10921, 4715, 9191, 2581, 685, 816, 3646, 2544, 390, 2412, 6332, 721, 752, 1057, 271, 1162, 66, 18, 1663, 1599, 891, 513, 2096, 326, 4758, 247, 4715, 2281, 326, 1029, 476, 1421, 441, 281, 253, 3908, 273, 253, 4737, 533, 752, 1057, 352, 1599, 802, 1087, 1037, 50275, 32897, 923, 1840, 5474, 339, 431, 248, 2929, 310, 7514, 342, 11968, 16892, 42214, 6486, 1375, 285, 6486, 2250, 2317, 278, 12132, 1895, 342, 1929, 10921, 285, 5502, 12624, 281, 8415, 436, 1895, 352, 29328, 247, 21582, 460, 88, 1299, 1332, 323, 253, 3646, 11786, 5933, 342, 15579, 37820, 253, 2022, 7680, 310, 253, 352, 19539, 326, 253, 4081, 5933, 556, 1980, 21396, 14940, 2867, 762, 2176, 13260, 352, 671, 4648, 767, 6667, 281, 921, 326, 253, 4081, 11333, 310, 7938, 685, 253, 1655, 3646, 11786, 1754, 11333, 275, 253, 6239, 50276, 284, 753, 275, 253, 6010, 2593, 436, 2929, 29328, 247, 21582, 460, 88, 1299, 1332, 326, 4648, 253, 16421, 273, 253, 344, 859, 757, 4315, 347, 271, 16851, 281, 28523, 253, 14940, 273, 253, 3646, 11786, 5933, 352, 2722, 326, 762, 2176, 2515, 2112, 342, 2810, 281, 253, 8654, 3302, 3646, 253, 21582, 460, 88, 1299, 3646, 11786, 5933, 26414, 13284, 5372, 50276, 953, 1663, 4722, 281, 923, 253, 21396, 14940, 323, 253, 21582, 460, 88, 1299, 3646, 11786, 1332, 533, 253, 2515, 762, 534, 436, 14940, 6556, 310, 1077, 12744, 323, 1650, 275, 10012, 608, 253, 13260, 403, 253, 295, 6348, 269, 310, 11233, 37913, 327, 247, 4581, 8578, 285, 253, 85, 518, 281, 253, 85, 505, 274, 849, 812, 359, 12654, 326, 841, 2515, 403, 1313, 323, 2176, 3237, 891, 1158, 22291, 436, 1953, 812, 1361, 22048, 253, 1980, 1255, 5426, 323, 436, 2929, 50276, 1542, 253, 806, 1650, 253, 3302, 3646, 310, 6447, 1580, 1754, 327, 253, 10012, 697, 21396, 14940, 6556, 672, 253, 3302, 3646, 310, 1077, 2810, 281, 253, 8654, 581, 310, 436, 2032, 323, 436, 1650, 347, 973, 671, 323, 253, 2429, 11333, 513, 368, 671, 897, 253, 1072, 3302, 3646, 849, 670, 253, 13542, 24251, 4715, 2281, 323, 305, 2617, 69, 260, 257, 1162, 355, 9169, 671, 556, 21396, 14940, 906, 323, 2810, 281, 8654, 3646, 1083, 849, 1057, 436, 9380, 906, 9184, 432, 352, 253, 21396, 14940, 273, 253, 4081, 21582, 460, 88, 1299, 5933, 310, 4722, 533, 253, 2515, 762, 534, 352, 6556, 310, 3965, 12744, 1580, 436, 310, 253, 9380, 2022, 7680, 697, 247, 45210, 19529, 323, 479, 387, 436, 2774, 891, 588, 3343, 323, 253, 30080, 22559, 281, 2953, 619, 7350, 2490, 187, 4118, 18435, 27, 2520, 310, 247, 5322, 2929, 534, 2722, 326, 27451, 12846, 1025, 3626, 3646, 11786, 7384, 3242, 2289, 281, 253, 278, 12132, 4495, 
642, 6046, 275, 253, 10921, 285, 2805, 1159, 8197, 534, 33526, 4872, 14940, 476, 897, 5697, 432, 21582, 460, 88, 1299, 3082, 285, 9295, 616, 21396, 14940, 50276, 28821, 253, 18349, 8704, 3646, 11786, 3082, 285, 616, 14940, 4142, 436, 310, 247, 9865, 3884, 285, 2021, 273, 5697, 50276, 328, 9520, 253, 30628, 574, 1142, 7350, 670, 9759, 285, 671, 273, 253, 3242, 4495, 285, 2954, 273, 253, 1543, 281, 2720, 789, 2853, 823, 281, 436, 285, 3877, 326, 581, 2523, 342, 21582, 460, 88, 1299, 3082, 310, 326, 352, 310, 12744, 849, 1048, 253, 5451, 249, 3408, 310, 4495, 253, 3408, 1078, 616, 21396, 14940, 32356, 275, 285, 436, 310, 1335, 271, 2523, 275, 253, 1246, 2987, 3762, 1529, 2523, 347, 5439, 407, 30628, 310, 253, 3064, 875, 253, 3963, 1025, 285, 440, 12846, 1025, 8654, 7823, 50276, 284, 824, 352, 2789, 3282, 323, 436, 2929, 281, 4763, 625, 673, 285, 40167 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper collects and adapts previously published datasets to construct a standardized benchmark for tinyml methods which previously suffer from poor comparability due to software/hardware heterogeneity the tasks include keyword spotting visual wake words image classification and anomaly detection the benchmark has two divisions one allowing any changes to the pipeline and one only allowing post training optimization it measures accuracy latency and energy use 1 the benchmark contains a wide range of tasks despite lacking nlprelated ones the tasks are also oriented at lowresource environments and applications can be found in real life eg single person detection for doorbells 2 the measurement considers latency energy and accuracy the use of a quality threshold instead of the commonly used tradeoff curve comparisons greatly simplifies the comparison 3 the benchmark has two divisions an open division that allows the change of model data and training and a closed division that focuses more on posttraining optimization this allows different methods that focus on different parts of the whole optimization process to show their benefits more clearly 4 the challenges for constructing a standardized tinyml benchmark are well summarized in the related work section the running and submission process for the benchmark as documented in the appendix requires significantly more effort than a typical benchmark submission where submitters usually only submit the test results this can limit the number of submitters however this is perhaps a tradeoff that has to be made for more comparability for the widely heterogeneous tinyml works it would also be nice to include a summary of the execution process in the main paper instead of only documenting it in the appendix docsepthis paper presents mlperf tiny the first industrystandard benchmark suite for ultralowpower tiny machine learning systems mlperf tiny measures the accuracy latency and energy of machine learning inference to properly evaluate the tradeoffs between systems this paper presents the first industrystandard benchmark suite for ultralowpower tiny machine learning systems it could push the development of compact model design specific experiments are lacking such as the accuracy latency and energy of the ml models mobilenetmicronetsrnn autoencoder on the mlperf tiny benchmark docsepthis paper focuses on the field of tiny machine learning which is used in ultralowpower devices the first industrystandard benchmark is introduced in this paper named mlperf tiny the details of the new benchmark are presented in various aspects including tasks datasets models evaluation process and system design this paper makes the following contributions 1 a new benchmark on tiny machine learning is established 2 the related work is introduced thoroughly the motivation of building such a benchmark is clear 3 the benchmark design is detailed with a modular implementation 4 it includes various tasks including image classification keyword spotting etc the contributions of the new benchmark on tiny ml are significant since the widespread use of ml models in smart devices requires fair comparisons between different methods and models establishing a new benchmark in this field may be valuable the benchmark is relevant to a broad research community and especially from the industry perspective the benchmark is opensourced and will be maintained in the future im not an expert in
this area from my perspective i have a few concerns about the benchmark my biggest question is about the evaluation although the authors have introduced many aspects of the benchmark i do not see many evaluation results done in this benchmark as there may be many existing methods/models performing tiny ml it is expected to conduct comprehensive studies to compare the existing methods based on the evaluation could you provide some findings or the limitations of existing methods for figure 1 can you provide more illustrations about the connections to the benchmark eg how do you deal with each level of figure 1 in your benchmark ### Summary:
the paper proposes a benchmark for machine learning models running on lowpower devices all reviewers saw value in the proposed benchmark but initially had some minor concerns and requested more details about how to run the benchmark wanted to see more evaluation results and wanted to see more discussion about connections between different layers of the tinyml stack figure 1 the authors were able to address these concerns through their responses and revisions to the paper and in the end all reviewers voted to accept the paper congratulations on having your paper accepted to the neurips 2021 datasets and benchmarks track the authors are encouraged to use the additional space in the cameraready version of the paper to continue refining the language and presentation in light of the new additions suggested by reviewers
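To make the quality-threshold comparison praised in the reviews above concrete, the sketch below ranks hypothetical benchmark submissions by latency and energy once they clear an accuracy floor. This is an editorial illustration only: the Submission fields, the 0.85 threshold, the rank_submissions function, and the demo numbers are assumptions and are not part of the MLPerf Tiny harness or the reviewed paper.

```python
# Illustrative only: field names, threshold, and demo numbers are assumed, not MLPerf Tiny outputs.
from dataclasses import dataclass

@dataclass
class Submission:
    name: str
    accuracy: float    # quality metric on the benchmark test set
    latency_ms: float  # median single-inference latency
    energy_uj: float   # energy per inference

def rank_submissions(results, accuracy_floor=0.85):
    """Drop submissions below the quality threshold, then rank by latency and energy."""
    qualified = [r for r in results if r.accuracy >= accuracy_floor]
    return sorted(qualified, key=lambda r: (r.latency_ms, r.energy_uj))

if __name__ == "__main__":
    demo = [
        Submission("kws-int8", accuracy=0.90, latency_ms=181.0, energy_uj=1300.0),
        Submission("kws-fp32", accuracy=0.92, latency_ms=540.0, energy_uj=4100.0),
        Submission("kws-pruned", accuracy=0.84, latency_ms=95.0, energy_uj=700.0),  # fails the floor
    ]
    for r in rank_submissions(demo):
        print(f"{r.name}: {r.latency_ms} ms, {r.energy_uj} uJ")
```

The point of the threshold is the one the reviewer highlights: a submission either meets the quality bar or it does not, so systems can then be compared on latency and energy without reasoning about a full accuracy/latency tradeoff curve.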
[input_ids, attention_mask, and labels token-id arrays for this example omitted]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: update the authors went out of their way to address my concerns about the absence of the unbalanced class setting they added a new dataset svhn new results table 4 and updated some of their explanations all these additions seem satisfactory i was also pleased with the feedback about computational cost r3 i improved my rating while i agree with the concerns of reviewer 4 those i could understand they would apply to every publication i have read about calibration and i think the authors addressed these concerns to the best of our current knowledge this paper proposes an information maximization binning scheme for calibration starting with a good introduction a clear progression leads to the core algorithm described by theorem 1 limits of previous histogrambased approaches both in terms of performance and reliability of the metric are clearly demonstrated with clear figures and proper references while using information measures to drive histogram binning has been done i assume that the current classification setting where one maximizes the mi between the logit and the class is novel the authors do not give a pointer to previous work here only mentioning the info bottleneck without references theorem 1 leads to an alternative minimization algorithm with analytical steps i did not check the convergence behavior proof but figure 3 is convincing enough i did not fully understand the information bottleneck limit experiments show first that the information binning strategy is far superior to equalmass or equalsize binning tables 2 and 3 then show how it improves on most scaling algorithms used for calibration one detail i am not comfortable with is the ece1k hack as it looks like a lastminute addition to give even stronger gains to the imax method a more principled introduction would be better see below this would be an excellent paper except for the following which casts doubts on whether all the steps of the method generalize to an unbalanced multiclass setting it is probably possible to fix or explain before publication this paper relies on a very unnatural and unfortunate state of affairs in ml classes are equally distributed on the test data the phrasing does not even consider any other possibility and some of the algorithms seem to be quite specific to this setup requiring significant changes in the real world case where test classes are not equally distributed at the end of section 32 the authors propose an algorithm to merge sk across k classes based on the observation that they have similar distributions rather than a proof they run a simulation on imagenet sec a2 that shows it is better than binning each sk separately while the experiment is elegant it probably strongly relies on the fact that each sk has the same 1k1 split what would happen if the classes follow a more realistic zipf law as observed in real nlp classification tasks i would assume that the merging process could still be applicable but applied to groups of sk with similar class0/class1 distributions in section 41 the trick to remove from the measure of the ece classes where the predicted probability is less than 1k also depends on a uniform 1k prior it should also be adapted to a nonuniform prior docsepupdate after the rebuttal the authors have answered my concerns i believe the paper should be accepted and would be a nice contribution to the current research the paper proposes a novel approach for posthoc calibration of outputs of
the neural networks to estimate uncertainty of their predictions the paper considers the histogram binning approach in contrast to scaling approaches existing in the literature and utilises information theory in building bins strong points the work is very well placed in the context of the existing literature identifying the current gaps theoretically sound motivation of the approach extensive empirical evaluation weak points some discussion of the cost of the proposed method is lacking ie how expensive in terms of computational time and memory this new calibration method is the methods are compared with respect to accuracy and expected calibration error ece only it has been shown that ece is not a good metric for comparing different methods see eg ashukha a lyzhov a molchanov d and vetrov d 2020 pitfalls of indomain uncertainty estimation and ensembling in deep learning iclr 2020 i am recommending acceptance of the paper though addressing the weak points above would largely improve the paper the reasons for this decision are that strong points outweigh weak points the proposed idea is interesting it is shown that it is promising in practice subject to not very good metrics and the paper is mostly well written and easy to follow questions to authors could you please address the raised weak points additional feedback not necessarily important for evaluation but could help to improve the paper 1 the part on shared classwise binning is rather rushed in the main paper and it is not very clear it is also a rather independent contribution from the main imax calibration contribution it would be better to somehow put them under one umbrella 2 section 2 bayesian dnns eg blundell et al 2015 and their approximations gal ghahramani 2016 a very arguable statement i would suggest rephrasing it since blundell et al proposed variational inference which is also an approximation and the gal ghahramani work is not an approximation of the blundell et al model 3 section 42 namely matrix scaling w l2 dot after w is read as a full stop which is confusing 4 after eq5 so we can solve the problem by iteratively and alternately updating gm and phim based on a12 it seems eq 5 and a12 are the same and it would be more convenient to refer to eq 5 in the text right after it 5 i am a little bit missing the overall procedure of the proposed calibration ie all details are there especially if one refers to the appendix but after reading the main paper there is no feeling that i can now go and implement it for my problem maybe pseudocode can help or just stepbystep guidance minor section 3 first paragraph sec 32 redundant bracket
and issues that i think need to be addressed 1 one approach to estimating uncertainty in classification is to choose a model and a regularized loss function to inherently learn a good representation for example using confidence as a term for regularization in neural networks is proposed in regularizing neural networks by penalizing confident output distributions iclr 2017 which penalizes lowentropy output distributions i think it is worth comparing the results with such existing work and discussing the advantages and disadvantages since a similar concept has been used once while training the model and once in this paper as a posthoc calibration 2 it is interesting that a single calibrator on the merged training sets of all k classwise problems scw performs well as it is mentioned in the paper it introduces bias due to having samples drawn from the other classes in hb increasing its number of evaluation bins reduces the bias but in scw such bias cannot be controlled moreover figure a2 shows it achieves smaller jsds which is not expected is there any reason for that what would happen if the number of bins is increased 3 based on the experimental results it seems imax performs better than other binning approaches however compared to the scaling methods it seems gp wenger et al 2020 performs better at nll/brier than the imax variants 4 even though the paper shows combining imax with gp improves the ece it is not clear how the issues of each approach will be handled for example the ece might be underestimated ### Summary:
the paper proposes maximizing the mutual information to optimize the bins for multiclass calibration the idea technique and presentation are good the paper solves some multiclass calibration issues the authors should revise the paper according to the reviewers comments before publication
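To ground the calibration metrics discussed in the reviews above, here is a small numpy sketch of expected calibration error (ECE) with equal-width confidence bins, plus equal-mass bin edges of the kind histogram-binning calibrators use. It is an editorial illustration, not code from the reviewed paper: the function names, bin counts, and simulated classifier are assumptions, and it does not implement the mutual-information (I-Max) binning itself.

```python
# Editorial sketch: standard ECE and equal-mass bin edges, not the I-Max method itself.
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=15):
    """ECE with equal-width bins: sum over bins of (bin weight) * |accuracy - confidence|."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if not in_bin.any():
            continue
        weight = in_bin.mean()                 # fraction of samples landing in this bin
        acc = correct[in_bin].mean()           # empirical accuracy in the bin
        conf = confidences[in_bin].mean()      # average reported confidence in the bin
        ece += weight * abs(acc - conf)
    return ece

def equal_mass_edges(confidences, n_bins=15):
    """Bin edges holding roughly the same number of samples per bin (quantile binning)."""
    qs = np.linspace(0.0, 1.0, n_bins + 1)
    return np.quantile(np.asarray(confidences, dtype=float), qs)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    conf = rng.uniform(0.5, 1.0, size=10_000)
    # Simulate an overconfident classifier: true accuracy lags reported confidence.
    correct = rng.random(10_000) < (conf - 0.1)
    print("ECE:", round(expected_calibration_error(conf, correct), 4))
    print("Equal-mass edges:", np.round(equal_mass_edges(conf, n_bins=5), 3))
```

With only a few samples per class, the equal-mass edges above become noisy, which is the sample-inefficiency concern the review raises and the motivation it cites for sharing one calibrator across the k classwise problems.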
[input_ids and attention_mask token-id arrays for this example omitted; the labels array follows]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 11183, 253, 4477, 2427, 562, 273, 616, 1039, 281, 2953, 619, 7350, 670, 253, 5928, 273, 253, 440, 30063, 966, 4758, 597, 2879, 247, 747, 15302, 18504, 13107, 747, 1543, 2829, 577, 285, 9300, 690, 273, 616, 22909, 512, 841, 30733, 1646, 20297, 891, 369, 671, 13864, 342, 253, 8680, 670, 15180, 2105, 391, 20, 891, 5520, 619, 13716, 50276, 6050, 891, 5194, 342, 253, 7350, 273, 37317, 577, 1110, 891, 812, 2096, 597, 651, 4647, 281, 1046, 9311, 891, 452, 1239, 670, 18543, 285, 891, 1158, 253, 4477, 9713, 841, 7350, 281, 253, 1682, 273, 776, 1655, 3640, 50276, 2520, 2929, 29328, 271, 1491, 11903, 1320, 10269, 920, 6974, 323, 18543, 4983, 342, 247, 1175, 10199, 247, 2590, 10005, 5644, 281, 253, 5161, 5933, 2529, 407, 10012, 337, 7787, 273, 2045, 1872, 31481, 1369, 833, 7274, 1097, 275, 2426, 273, 3045, 390, 13367, 273, 7982, 403, 4518, 5183, 342, 2590, 8442, 285, 1463, 10414, 1223, 970, 1491, 5593, 281, 4446, 33105, 10269, 920, 556, 644, 2218, 891, 5467, 326, 253, 1655, 9162, 4758, 835, 581, 11903, 4219, 253, 3641, 875, 253, 2412, 262, 285, 253, 966, 310, 4460, 253, 4477, 513, 417, 1918, 12219, 281, 2045, 789, 1060, 760, 29570, 253, 8692, 3673, 44856, 1293, 10414, 50276, 33921, 337, 5644, 281, 271, 5795, 41458, 5933, 342, 16101, 5018, 891, 858, 417, 2451, 253, 14940, 3879, 4737, 533, 4677, 495, 310, 21414, 2217, 891, 858, 417, 4751, 2096, 253, 1491, 3673, 44856, 2701, 50276, 16217, 3825, 921, 806, 326, 253, 1491, 10269, 920, 5700, 310, 2080, 8936, 685, 4503, 14611, 390, 1979, 10269, 920, 2829, 374, 285, 495, 840, 2722, 849, 352, 19132, 327, 954, 13642, 11333, 908, 323, 18543, 581, 2508, 891, 717, 417, 9848, 342, 253, 299, 336, 18, 76, 13908, 347, 352, 4453, 751, 247, 1390, 15505, 1635, 281, 1918, 1014, 10046, 15988, 281, 253, 516, 991, 1332, 247, 625, 3505, 74, 6216, 10199, 651, 320, 1805, 923, 2708, 50276, 2520, 651, 320, 271, 7126, 2929, 3707, 323, 253, 1563, 534, 43603, 24626, 1880, 512, 253, 5018, 273, 253, 1332, 39970, 281, 271, 440, 30063, 23559, 14407, 4758, 352, 310, 3164, 1896, 281, 4993, 390, 5513, 1078, 9311, 436, 2929, 15771, 327, 247, 1077, 44822, 285, 23293, 1375, 1171, 2843, 1094, 275, 13361, 5971, 403, 9696, 5939, 327, 253, 1071, 941, 253, 9839, 2355, 1057, 1014, 1908, 667, 643, 6387, 285, 690, 273, 253, 11333, 1646, 281, 320, 3240, 2173, 281, 436, 9978, 10568, 1534, 2544, 275, 253, 1524, 1533, 1083, 835, 1071, 5971, 403, 417, 9696, 5939, 50276, 255, 253, 990, 273, 2593, 4567, 253, 4477, 12661, 271, 5933, 281, 17310, 1629, 2439, 465, 5971, 1754, 327, 253, 8310, 326, 597, 452, 2074, 3268, 2581, 685, 247, 4737, 597, 1408, 247, 9864, 327, 4440, 257, 292, 4706, 247, 19, 326, 2722, 352, 310, 1805, 685, 10269, 920, 1016, 1629, 11794, 1223, 253, 3368, 310, 20654, 352, 3164, 7052, 15771, 327, 253, 958, 326, 1016, 1629, 556, 253, 1072, 337, 76, 18, 8085, 752, 651, 5108, 604, 253, 5971, 956, 247, 625, 15958, 23367, 71, 1569, 347, 2540, 275, 1524, 295, 24343, 9162, 8892, 891, 651, 5467, 326, 253, 34047, 1232, 812, 1335, 320, 7763, 533, 3732, 281, 2390, 273, 1629, 342, 2074, 966, 17, 2437, 18, 10670, 50276, 249, 2593, 7609, 253, 10480, 281, 5386, 432, 253, 2557, 273, 253, 299, 336, 5971, 835, 253, 8131, 5912, 310, 1679, 685, 337, 76, 671, 7024, 327, 247, 6447, 337, 76, 2720, 352, 943, 671, 320, 12956, 281, 247, 1327, 23714, 2720, 5474, 33032, 11183, 846, 253, 30080, 22559, 253, 4477, 452, 9577, 619, 7350, 891, 2868, 253, 2929, 943, 320, 7607, 285, 651, 
320, 247, 5322, 7680, 281, 253, 1655, 2561, 50272, 783, 2929, 29328, 247, 4460, 2746, 323, 1501, 37806, 18543, 273, 18012, 273, 253, 11454, 6928, 281, 6642, 11649, 273, 697, 10554, 253, 2929, 19401, 253, 33105, 10269, 920, 2746, 275, 4499, 281, 13642, 7274, 5368, 275, 253, 6239, 285, 4981, 3013, 253, 1491, 3762, 275, 3652, 27925, 50276, 9072, 2792, 50276, 783, 789, 310, 1077, 973, 4845, 275, 253, 3634, 273, 253, 5368, 6239, 12488, 253, 1655, 18388, 50276, 783, 7262, 1037, 3590, 16038, 273, 253, 2746, 50276, 2068, 3134, 16774, 7103, 50276, 20881, 2792, 50276, 8826, 5955, 273, 253, 2105, 273, 253, 4081, 1332, 310, 14999, 50276, 466, 849, 1199, 275, 2426, 273, 15180, 673, 285, 3541, 436, 747, 18543, 1332, 310, 50275, 783, 3082, 403, 2429, 342, 1675, 281, 7200, 285, 3264, 18543, 2228, 299, 336, 760, 352, 556, 644, 2011, 326, 299, 336, 310, 417, 247, 1175, 7982, 323, 10941, 1027, 3082, 923, 24088, 15898, 2788, 3227, 247, 12865, 20122, 729, 247, 14008, 2291, 729, 277, 285, 26925, 18540, 277, 9169, 8483, 27366, 273, 801, 297, 404, 11649, 13418, 285, 546, 35128, 275, 3676, 4715, 17857, 32888, 9169, 50275, 74, 717, 46705, 14924, 273, 253, 2929, 2167, 15974, 253, 5075, 2792, 1840, 651, 8127, 3157, 253, 2929, 253, 4606, 323, 436, 3061, 310, 326, 2266, 2792, 32180, 798, 5075, 2792, 253, 4081, 2934, 310, 4722, 352, 310, 2011, 326, 352, 310, 12532, 275, 3946, 2256, 281, 417, 1077, 1175, 17082, 285, 253, 2929, 310, 6571, 973, 3542, 285, 3477, 281, 956, 50276, 34974, 281, 4477, 50276, 16534, 368, 4496, 2953, 5439, 5075, 2792, 50276, 38092, 8680, 417, 7933, 1774, 323, 7103, 533, 812, 1361, 281, 3157, 253, 2929, 337, 253, 629, 327, 6096, 966, 3020, 10269, 920, 310, 2581, 20906, 275, 253, 2022, 2929, 285, 352, 310, 417, 1077, 2590, 352, 310, 671, 2581, 3907, 7680, 432, 253, 2022, 516, 991, 18543, 7680, 352, 651, 320, 1805, 281, 10380, 1691, 731, 762, 581, 33265, 374, 2593, 374, 17699, 16561, 277, 79, 2224, 24088, 787, 1504, 437, 1162, 355, 4104, 285, 616, 34754, 5918, 50276, 18068, 1240, 3358, 6451, 4022, 50276, 66, 1077, 1736, 8584, 3908, 891, 651, 1804, 294, 545, 83, 2355, 352, 1580, 787, 1504, 437, 1162, 355, 4081, 39762, 17032, 534, 310, 671, 271, 11193, 285, 5918, 50276, 18068, 1240, 3358, 6451, 789, 310, 417, 271, 11193, 273, 787, 1504, 437, 1162, 355, 1566, 495, 2593, 5976, 10775, 4315, 13642, 259, 298, 19, 14261, 846, 259, 310, 1239, 347, 247, 2120, 3523, 534, 310, 21643, 577, 846, 16186, 22, 594, 359, 476, 8415, 253, 1895, 407, 10040, 3146, 285, 3960, 1523, 22753, 305, 78, 285, 815, 303, 1754, 327, 247, 805, 50276, 262, 3133, 16186, 608, 285, 247, 805, 403, 253, 1072, 285, 352, 651, 320, 625, 11638, 281, 3730, 281, 16186, 608, 275, 253, 2505, 987, 846, 352, 608, 891, 717, 247, 1652, 2372, 5816, 253, 4583, 5199, 273, 253, 4081, 18543, 26332, 512, 4278, 403, 627, 3340, 604, 3730, 281, 30762, 533, 846, 4361, 253, 2022, 2929, 627, 310, 642, 5471, 326, 891, 476, 1024, 564, 285, 3359, 352, 323, 619, 1895, 5046, 247, 10585, 406, 853, 476, 1361, 390, 816, 3213, 1615, 10539, 12925, 50276, 37585, 2593, 495, 806, 12494, 4706, 4567, 50276, 433, 1504, 386, 24312, 50276, 7152, 33032, 2520, 2929, 16681, 253, 3374, 342, 253, 13642, 1332, 285, 33105, 10269, 920, 26332, 45166, 18543, 2228, 275, 13642, 3082, 285, 11741, 281, 14003, 9162, 7200, 285, 3410, 460, 15412, 275, 288, 67, 597, 897, 253, 516, 991, 4473, 323, 10269, 920, 534, 11903, 4219, 253, 15577, 1491, 875, 13301, 285, 2677, 1025, 2412, 953, 597, 1750, 326, 616, 2746, 4784, 304, 684, 2442, 2957, 275, 19947, 3045, 285, 4483, 19645, 7756, 273, 19947, 285, 
18543, 3045, 407, 557, 290, 36874, 253, 13757, 273, 10269, 9297, 285, 15572, 597, 671, 12661, 247, 6096, 966, 3020, 660, 88, 5700, 326, 13840, 247, 2014, 24403, 1080, 327, 253, 21884, 3733, 5239, 273, 512, 465, 966, 3020, 3237, 281, 3157, 253, 50276, 16848, 6733, 50276, 783, 2929, 310, 973, 3542, 285, 253, 4477, 2085, 2217, 16038, 285, 30328, 273, 2139, 46875, 253, 15577, 1491, 875, 13301, 285, 2677, 1025, 2412, 953, 651, 1361, 23559, 14407, 18543, 627, 403, 690, 7350, 285, 3374, 326, 891, 1158, 3198, 281, 320, 9713, 50275, 18, 581, 2746, 275, 26230, 11649, 275, 9162, 310, 281, 5206, 247, 1566, 285, 247, 3963, 1025, 2957, 1159, 281, 26557, 3037, 247, 1175, 6779, 323, 1650, 970, 7162, 347, 247, 1307, 323, 37820, 275, 11454, 6928, 310, 4081, 275, 3963, 3006, 11454, 6928, 407, 29697, 3006, 13224, 3453, 10670, 17857, 32888, 4240, 326, 29697, 4219, 1698, 290, 10144, 3453, 10670, 891, 1158, 352, 310, 4409, 10941, 253, 1543, 342, 824, 5368, 789, 285, 16585, 253, 11361, 285, 23797, 1580, 247, 2074, 4473, 556, 644, 908, 581, 1223, 3733, 253, 1566, 285, 436, 2929, 347, 247, 1501, 37806, 18543, 50275, 19, 352, 310, 4722, 326, 247, 2014, 24403, 1080, 327, 253, 21884, 3733, 5239, 273, 512, 465, 966, 3020, 3237, 660, 88, 17923, 973, 347, 352, 310, 5393, 275, 253, 2929, 352, 23970, 8492, 1955, 281, 1907, 3530, 8392, 432, 253, 643, 5971, 275, 288, 67, 3629, 697, 1180, 273, 7103, 27925, 11355, 253, 8492, 533, 275, 660, 88, 824, 8492, 476, 417, 320, 6537, 25761, 4677, 247, 19, 2722, 352, 33526, 4577, 23421, 1397, 534, 310, 417, 3264, 310, 627, 667, 1921, 323, 326, 752, 651, 5108, 604, 253, 1180, 273, 27925, 310, 2559, 50275, 20, 1754, 327, 253, 5661, 1543, 352, 3133, 516, 991, 17923, 1805, 685, 643, 10269, 920, 7274, 2299, 2429, 281, 253, 13642, 3082, 352, 3133, 31025, 259, 9562, 1162, 355, 9169, 17923, 1805, 387, 295, 620, 67, 4586, 685, 253, 516, 991, 11640, 50275, 21, 1014, 2167, 253, 2929, 2722, 16248, 516, 991, 342, 31025, 19132, 253, 299, 336, 352, 310, 417, 2590, 849, 253, 3374, 273, 1016, 2746, 588, 320, 15726, 323, 1650, 253, 299, 336, 1537, 320, 41901, 50275, 187, 187, 4118, 18435, 27, 783, 2929, 29328, 281, 46875, 253, 15577, 1491, 281, 22318, 253, 10269, 323, 23559, 14407, 18543, 253, 2934, 5853, 285, 9759, 403, 1175, 253, 2929, 35910, 690, 23559, 14407, 18543, 50276, 22402, 253, 2488, 943, 49620, 253, 2929, 2556, 253, 30628, 5701, 1078, 15452 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: in this paper the authors first show that under some assumptions smoothness and convexity enforcing individual fairness if on the source distribution can potentially improve the performance of ml models on the target distribution and then propose a training mechanism to add an if regularizer into the training process to achieve individual fairness in the target distribution the authors also show that domain adaptation algorithms that align the feature distributions in the source and target domain can be used to improve if the idea that enforcing individual fairness can help with distribution shift tasks is a natural idea and using regularization is also an intuitive way of achieving such an objective the paper also provides a bridge between individual fairness and domain adaptation literature the major weakness of this paper is that its not clear why the main results are insightful in particular theorem 24 and theorem 27 it seems to simply say that the target loss can be bounded as a source loss plus the regularizers penalties which seems natural perhaps even obvious i wish the bound could be more concrete and interpretable one way of fixing this problem is to add a corollary for theorem 24 for specific regularizers which will help convince the reader that this bound is nontrivial for classes of regularizers they may use another problem of this paper is the meaning and the interpretation of their bounds for example in theorem 2 i would like to see ns and nt exposed in the bound in theorem 24 because those variables describe the size of the input thus the bound would be more insightful if the authors could explain why the bound is proportional to ns and inversely proportional to nt the example of wedding dress provided in section 2 feels uncomfortable it almost seems to associate the development stage of a country with its brides wearing white dresses for weddings you can avoid this by simply talking about a specific image dataset that contains photos of europeanstyle wedding gowns and note that it lacks examples of wedding dresses from a specific and contrasting culture docsepthis paper shows that satisfying individual fairness if and representation alignment methods can complement each other enforcing if is shown to mitigate algorithmic biases caused by covariate shift as long as the regression function satisfies if conversely if can be enforced by aligning the distributions of the features under a factor model experiments verify the theoretical results strengths shows interesting connections between if and domain adaptation methods provides theoretical evidence for the above connections empirical results on various datasets show that if can be used for domain adaptation and vice versa weaknesses overall the compatibility between if and domain adaptation seems to only hold in restrictive settings if is similar to domain adaptation only for covariate shifts but not necessarily for other data shifts label and concept shifts fundamentally if means similar samples get similar predictions so it obviously cannot address all data shifts in general also domain adaptation is only useful to if for specific factor model structures i think the paper would be more interesting if if techniques are used to improve domain adaptation techniques stated as future work but the current contribution focuses on their similarities only it could be that using if along with domain adaptation does 
not help domain adaptation at all and vice versa but i do not see any discussion about whether the two complement subsume or possibly have a negative effect on each other several assumptions are made to make the theory work without much justification and it is not clear how realistic they are there should be more convincing explanations and possibly empirical evidence assumption 21 this seems to assume that the penalty of the regression function f0 is always small leq delta wouldnt this result in trivial regression and penalty functions assumption 22 why should we believe that r is strongly convex assumption 23 why is it reasonable that l is both strongly convex and strongly smooth assumption 26 why is it reasonable to assume that r satisfies strong convexity in the experiments the assumptions are not verified on the real datasets there are only end results for if and domain adaptation the author response addresses my concerns so i am increasing my score the authors say the limitations are stated in the paper but it is not clear where exactly there does not seem to be negative societal impact docsepthis work mainly focuses on individual fairness problem under covariate shift the authors find that individual fairness and domain adaptation methods can benefit from each other as below 1 the regularizers for enforcing individual fairness can help models adapt to new domains the authors consider the transduction and inductive settings under the standard smoothness and convexity assumptions on the regularizer and the loss function the authors give a bound on the targetdomain risk of models that are regularized by individual fairness 2 on a synthetic setting where the source and target domain have the opposite sensitive attribute the authors show that a domain invariant transformation appeals to individual fairness the authors also validate their theory on two textual datasets strengths in general the presented conclusions are intriguing and theoretically grounded i checked the proof sketch but didnt touch the proof details the paper is quite clearly written i didnt find any problem in understanding the assumptions and the theories the paper flow is lucid the scope of the studied problem may have significant impacts on both the individual fairness and domain adaptation fields weaknesses i would suggest the authors to significantly improve the structure of this paper there should be a preliminary section for better understanding the problem settings as well as some key concepts in particular the settings of domain adaptation need to be further explained i would also suggest the authors to complement a separate section to introduce prior works on algorithmic fairness under domain shift a large body of recent works on algorithmic fairness under distribution shift is missing including 1 h singh r singh v mhasawade and r chunara fairness violations and mitigation under covariate shift in proceedings of the 2021 acm conference on fairness accountability and transparency 2021 pp 313 2 j schrouff n harris o koyejo i alabdulmohsin e schnider k opsahlong a brown s roy d mincu c chen et al maintaining fairness across distribution shift do we have viable solutions for realworld applications arxiv preprint arxiv220201034 2022 3 a rezaei a liu o memarrast and bd ziebart robust fairness under covariate shift aaai 2021 4 y chen rp raab j wang and y liu fairness transferability subject to bounded distribution shift arxiv preprint arxiv220600129 2022 i did not foresee any negative societal impact in this work ### 
Summary:
the initial reviews were divergent however during the rebuttal and discussion phase many of the raised concerns were addressed properly leading slightly towards accept while i have not checked some of the remaining issues in detail i believe the authors response answers them adequately hence i recommend the acceptance of this paper
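Both reviews in this row refer to adding an individual-fairness (IF) regularizer to the training objective. As a rough illustration of that general idea only (this is not the submission's actual method; the function and parameter names below, such as `if_regularizer` and `lam`, are assumptions), such a penalty can be sketched as pushing pairs of inputs that a fair metric treats as comparable toward similar outputs, added on top of the ordinary source-domain loss.

```python
import torch

def if_regularizer(model, x, x_similar):
    # penalize output differences between input pairs the fair metric deems comparable
    return torch.mean((model(x) - model(x_similar)) ** 2)

def regularized_loss(model, x, y, x_similar, task_loss, lam=1.0):
    # source-domain task loss plus a weighted individual-fairness penalty
    return task_loss(model(x), y) + lam * if_regularizer(model, x, x_similar)
```

The weight `lam` plays the role of the regularization strength discussed in the reviews; how it enters the target-domain bound is exactly what the first reviewer asks to be made more concrete and interpretable.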
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: brief summary of the paper this work discusses the transformation of planning tasks based on a finite domain representation fdr into tasks based on factored state transition systems fts the transformation is based on the mergeshrink framework which consists of various abstractionbased techniques commonly used to compute abstraction heuristics the key idea of this work is to transform the original task via a sequence of task transformations and to provide a reconstruction procedure which allows one to use a plan from the transformed task to compute a plan for the original task the paper discusses how to perform search on these ftsbased tasks and also provides an adaptation of the ff heuristic additionally previous fdr task reformulations are put into context and it is formally shown that these are dominated by the fts reformulations presented here the paper concludes with a comprehensive empirical evaluation which uncovers advantages and disadvantages of different types of task reformulations brief summary of the review this paper was a pleasure to read the quality of the work is very high and suitable to be presented at a top ai conference it is written fluently the formal presentation is concise and clear and simple examples guide the reader through the definitions moreover the framework generalizes multiple previously introduced planning task reformulations and ftsbased planning tasks warrant a deeper look into heuristic search performed on such tasks therefore this work fits perfectly into the scope of the workshop detailed review the paper is very dense yet it manages to cite relevant related work one can note though that while the introduction mentions that reformulations are a common tool to reduce the accidental complexity of a planning task this is never discussed again later on i wonder how the presented reformulation methods impact the accidental complexity the presentation of fdrfts representations the ms framework and the various transformations is concise clear formally correct and easy to follow the plan reconstruction is a bit harder to follow although being familiar with cascading tables helps here the example really helps to completely grasp the reconstruction process what i find a bit odd is that the paper mentions that it avoids pruningbased reformulations because of the overhead yet proposition 1 includes pruningbased reformulations since the proof considers the previously shown plan reconstruction process i wonder if proposition 1 really should include pruning i do not necessarily think it is wrong as i guess the overhead for pruning does not really affect the polytime property i would also have expected a part of the empirical evaluation to discuss this overhead yet it is not mentioned anywhere i think the paper would benefit from one or two more sentences on how much overhead one can expect when using pruning maybe the overhead warrants the pruning power the relation to other fdr reformulation methods is also clearly written and interesting to follow although one could argue that dominating is not necessarily the most fitting description for a method which can perform the same formulations why not require the dominating method to provide additional reformulations both shrinking and label reduction would still dominate in this case no the evaluation is also very comprehensive it evaluates the different techniques over all planning tasks of the previous ipcs 
and provides interesting insights while it mostly covers the results over all domains i think an analysis on the effect of the different transformations on particular domains could be interesting while this is partially hinted to in the conclusion the paper could benefit from a short discussion on which types of transformation might be particular useful or harmful in what type of domain minor comments plan reconstruction on the final task we can run any planning algorithm on pin to find a plan for pin since the final task is pin this sentence mentions three times the same task reconstruction of mlb reformulations lambdal is not defined before or later on i suspect lambda was renamed to fl example keeping track of one of the factored state that corresponds one of the factored states taulabel reconstrunction to some in pii to some ts in pii you use alpha for the weak bisimulation of theta but before alpha was usually used as the abstraction function for states and not transition systems proof of prop 2 why can computing the coarses weak bisimulation of a ts be done in poly time def 4 i suggest to use two different namingfont schemes to differentiate between the reformulation methods of fts and fdr tasks figure 2 the figure could explicitly say which plot refers to weak bisimulation shrinking discussion of fig 3 this plots this plot tt is also remarkable the large amount of instances grammar there is no expansions until last jump first i wanted to mention grammar but then i remembered that this is one of the search properties for fd benchmarks i suspect this confuses more people than it helps people familiar with fd benchmarks so i would suggest to just remove this remark docsepthis paper revisits the idea of applying a suite of automatic simplification techniques to a planning problem in advance of solving it with the difference that simplifications are applied to the internal factored transition system representation instead of the original modelling formalism such as strips or fdr this is claimed to have some advantage in that the representation allows for conditional effects and certain forms of disjunctions which allows the result of certain additional simplifications to be expressed on the downside the authors need to recreate the planning machinery to operate on this representation which at current is less efficient than the welldeveloped planner implementation in fast downward that operates on the fdr representation in section reconstruction of mlb reformulations on page 4 at the beginning of the section the labelabstracting function is falled fl but later in the section it seems to change name to lambda is this a typo or is lambda a different function if so what is it binary variables example later in the same section the authors claim that applying bisimulation shrinking the resulting ts has still only 2 states one where both counters are set to 1 and another where at least one counter must still be set i dont see how this can be a bisimulation it would mean that 00 01 10 but applying the action that sets the first variable in 00 leads to 10 while applying it in state 01 leads to 11 which with bisimilarity should imply that 10 11 which contradicts the first condition of bisimilarity since 11 is a goal state and 10 is not perhaps the authors meant applying weak bisimilarity but in that case which are the internal labels in figure 2 what is the difference between the plots in the top row and the plots in the bottom row the labels or caption does not say anything in the section on 
search space reduction the authors mention a number of domains that are completely solved by simplification this was also observed in the original paper does the generalization of the simplifications proposed here result in any new domain being fully solved that was not with the previous versions of the same techniques or any individual task within a not fully solved domain ### Summary:
dear authors thank you very much for your submission we are happy to inform you that we have decided to accept it and we look forward to your talk in the workshop please go over the feedback in the reviews and correct or update your papers in time for the camera ready date may 24 best regards hsdip organizers
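The bisimulation objection in the second review of this row, about the two binary counters, can be checked mechanically. The sketch below is my own reconstruction of that example, with assumed state and action names; it checks plain (strong) bisimulation without internal labels, which is exactly the reviewer's point, and confirms that merging 00, 01 and 10 into one abstract state is not a bisimulation of the original transition system. Weak bisimulation with internal labels, which the reviewer suspects the authors intended, would require a different check.

```python
from itertools import product

# Reviewer's example (reconstructed): two binary variables, each settable to 1 by its own action.
states = ['00', '01', '10', '11']
goal = {'11'}

def apply_action(state, action):
    # 'a1' sets the first variable, 'a2' sets the second (assumed action names)
    v1, v2 = state
    return '1' + v2 if action == 'a1' else v1 + '1'

def is_bisimulation(blocks):
    # standard (strong) bisimulation check for a candidate partition of the states
    block_of = {s: b for b in blocks for s in b}
    for block in blocks:
        for s, t in product(block, repeat=2):
            if (s in goal) != (t in goal):
                return False  # goal states may only be grouped with goal states
            for action in ('a1', 'a2'):
                if block_of[apply_action(s, action)] != block_of[apply_action(t, action)]:
                    return False  # successors of grouped states land in different blocks
    return True

# Merging 00, 01 and 10, the abstraction questioned by the reviewer, is not a bisimulation:
print(is_bisimulation([frozenset({'00', '01', '10'}), frozenset({'11'})]))  # -> False
# The trivial partition into singletons of course is:
print(is_bisimulation([frozenset({s}) for s in states]))                    # -> True
```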
6571, 10949, 253, 1543, 689, 512, 10625, 891, 1158, 271, 1783, 327, 253, 1055, 273, 253, 1027, 21257, 327, 1798, 10625, 812, 320, 4722, 1223, 436, 310, 10571, 47466, 281, 275, 253, 6452, 253, 2929, 812, 5649, 432, 247, 2159, 5955, 327, 534, 3510, 273, 9261, 1537, 320, 1798, 4217, 390, 19632, 275, 752, 1511, 273, 5028, 50276, 37585, 5701, 50276, 11139, 14433, 50276, 251, 253, 2457, 4836, 359, 476, 1408, 667, 7219, 5933, 327, 9176, 50276, 936, 1089, 247, 2098, 50276, 1542, 9176, 1580, 253, 2457, 4836, 310, 9176, 436, 6197, 25957, 1264, 2069, 253, 1072, 4836, 50276, 250, 11682, 273, 13361, 67, 8460, 3339, 50276, 77, 1369, 26955, 310, 417, 2931, 1078, 390, 1996, 327, 891, 9101, 29331, 369, 27624, 281, 892, 50276, 11667, 7562, 3540, 273, 581, 273, 253, 958, 2149, 1375, 326, 10140, 50276, 531, 273, 253, 958, 2149, 3054, 50275, 893, 335, 1492, 8756, 1344, 4346, 50275, 936, 690, 275, 268, 2886, 50276, 936, 690, 28669, 275, 268, 2886, 50275, 5658, 897, 9765, 323, 253, 5075, 17542, 303, 1427, 273, 39116, 533, 1078, 9765, 369, 3798, 908, 347, 253, 38562, 1159, 323, 3054, 285, 417, 5502, 2718, 50274, 16314, 273, 4198, 374, 2139, 476, 12672, 253, 820, 1032, 265, 5075, 17542, 303, 1427, 273, 247, 28669, 320, 2218, 275, 3488, 673, 50276, 1545, 577, 891, 1804, 281, 897, 767, 1027, 26086, 4909, 15849, 281, 22629, 875, 253, 8460, 1427, 3082, 273, 269, 1641, 285, 269, 5267, 8892, 50276, 13206, 374, 253, 4677, 812, 11120, 1333, 534, 7484, 10770, 281, 5075, 17542, 303, 1427, 39443, 50276, 49794, 273, 3036, 495, 50275, 2520, 14777, 50276, 2520, 7484, 50275, 1440, 310, 671, 13406, 253, 1781, 2408, 273, 10872, 50276, 1710, 4175, 50275, 9088, 310, 642, 40955, 1919, 1390, 6923, 806, 891, 3078, 281, 3748, 28146, 533, 840, 891, 12659, 326, 436, 310, 581, 273, 253, 3186, 3607, 323, 29439, 49602, 891, 9101, 436, 1461, 5123, 625, 952, 685, 352, 7729, 952, 7615, 342, 29439, 49602, 594, 891, 651, 1804, 281, 816, 5386, 436, 7579, 50273, 7152, 33032, 2520, 2929, 27694, 953, 253, 2934, 273, 9433, 247, 18880, 273, 12077, 8077, 1877, 5609, 281, 247, 7219, 1895, 275, 7170, 273, 16161, 352, 342, 253, 3064, 326, 8077, 6787, 403, 3732, 281, 253, 4812, 958, 2149, 5502, 985, 6779, 3185, 273, 253, 3236, 26278, 30221, 824, 347, 22486, 390, 269, 5267, 436, 310, 7558, 281, 452, 690, 5750, 275, 326, 253, 6779, 4483, 323, 17697, 2538, 285, 2176, 4948, 273, 557, 30986, 960, 534, 4483, 253, 906, 273, 2176, 3081, 8077, 6787, 281, 320, 4469, 327, 253, 42719, 253, 4477, 878, 281, 48516, 253, 7219, 20949, 281, 10196, 327, 436, 6779, 534, 387, 1655, 310, 1679, 5919, 685, 253, 6210, 392, 70, 1155, 264, 499, 9582, 7092, 275, 3809, 21169, 326, 17209, 327, 253, 269, 5267, 6779, 50276, 249, 2593, 14433, 273, 13361, 67, 8460, 3339, 327, 3239, 577, 387, 253, 5068, 273, 253, 2593, 253, 5203, 15834, 272, 1159, 310, 2965, 264, 892, 533, 1996, 275, 253, 2593, 352, 3133, 281, 1818, 1416, 281, 29331, 310, 436, 247, 1745, 80, 390, 310, 29331, 247, 1027, 1159, 604, 594, 752, 310, 352, 50276, 26458, 4903, 1650, 1996, 275, 253, 1072, 2593, 253, 4477, 1750, 326, 9433, 17542, 303, 1427, 39443, 253, 4795, 28669, 556, 1335, 760, 374, 3054, 581, 835, 1097, 33605, 403, 873, 281, 337, 285, 1529, 835, 387, 1878, 581, 4828, 1364, 1335, 320, 873, 891, 13414, 923, 849, 436, 476, 320, 247, 17542, 303, 1427, 352, 651, 1599, 326, 7449, 50276, 520, 50276, 740, 533, 9433, 253, 2250, 326, 5239, 253, 806, 4778, 275, 7449, 5644, 281, 884, 1223, 9433, 352, 275, 1375, 14805, 5644, 281, 1903, 534, 342, 17542, 303, 1858, 414, 943, 16084, 326, 884, 50276, 883, 534, 40878, 253, 806, 
1617, 273, 17542, 303, 1858, 414, 1580, 1903, 310, 247, 4736, 1375, 285, 884, 310, 417, 50276, 30875, 253, 4477, 5486, 9433, 5075, 17542, 303, 1858, 414, 533, 275, 326, 1083, 534, 403, 253, 4812, 13301, 50276, 249, 4677, 374, 752, 310, 253, 3064, 875, 253, 14777, 275, 253, 1755, 4194, 285, 253, 14777, 275, 253, 5004, 4194, 253, 13301, 390, 11743, 1057, 417, 1333, 2712, 50276, 249, 253, 2593, 327, 3186, 2317, 5141, 253, 4477, 3748, 247, 1180, 273, 10625, 326, 403, 4336, 14042, 407, 8077, 1877, 436, 369, 671, 2540, 275, 253, 3236, 2929, 1057, 253, 26647, 273, 253, 8077, 6787, 4081, 1060, 906, 275, 667, 747, 5028, 1146, 4751, 14042, 326, 369, 417, 342, 253, 2045, 9508, 273, 253, 1072, 5609, 390, 667, 2060, 4836, 1561, 247, 417, 4751, 14042, 5028, 2490, 187, 4118, 18435, 27, 69, 613, 4477, 5717, 368, 1077, 1199, 323, 634, 19529, 359, 403, 5211, 281, 4151, 368, 326, 359, 452, 4425, 281, 2997, 352, 285, 359, 1007, 3579, 281, 634, 2312, 275, 253, 22586, 4496, 564, 689, 253, 8680, 275, 253, 10123, 285, 3451, 390, 5731, 634, 9380, 275, 673, 323, 253, 6568, 4704, 3522, 778, 2164, 1682, 17730, 288, 8289, 532, 37630 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

This paper addresses an interesting problem: uncertainty estimation for GBRT. The authors propose a k-nearest-neighbors approach based on an affinity between the test sample and the training samples; to save computational time, sampling from trees is used. They show by experiments that the proposed IBUG works better than NGBoost and PGBM, two recent gradient-boosting algorithms for tree-model uncertainty.

Strengths: the idea has some similarities to existing works such as distance-based conformal prediction, but the way to calculate the distance, or the affinity, is new, so this paper shows a new idea, and the authors demonstrate the usefulness of the model. Comparisons to PGBM verify the benefits of such a GBRT uncertainty model.

Weaknesses: I have some concerns about technical issues; please see the list in the question section.

No potential negative societal impact.

This paper proposes a new method to estimate data uncertainty in GBDT models. The proposed approach uses the k nearest training elements to produce probabilistic predictions; the nearest elements are determined using the constructed ensemble, and the top instances are chosen based on the number of times they fall in the same leaf as the test example. The method can be applied to any GBDT model after it is trained.

Strengths:
- The method can work with any GBDT model and can be applied to any trained model at inference time.
- The reported results show that the proposed approach outperforms the existing methods NGBoost and PGBM in terms of both RMSE and probabilistic evaluation measures.

Weaknesses:
- The inference time can increase significantly.
- The paper does not compare with the existing probabilistic prediction for GBDT implemented in CatBoost (see details below in the questions).

Yes.

Starting from gradient-boosted trees, the paper develops a new method to estimate the conditional distribution p(y|x) for regression problems. The authors propose to estimate p(y|x) with a nearest-neighbor approach using a specific similarity measure: two instances are considered similar if they end up in the same leaf in many of the trees. Using the neighborhood, the variance or the full conditional distribution p(y|x) can be estimated. The authors also present a calibration procedure for situations where the variance is wrongly estimated during training. In the experiments, the new method is compared to two specific baselines on 20 tabular datasets, using NLL and CRPS as performance measures.

Strengths:
- The paper is well written and easy to follow.
- Uncertainty estimation for gradient-boosted regression trees is an important research problem, because boosted trees usually yield SOTA performance on tabular datasets.
- The experimental evaluation is quite extensive.

Weaknesses:
- The proposed method comes without any theoretical justification; it is in essence a heuristic.
- I would have liked to see a comparison with nearest-neighbor methods that use other similarity measures.

The presented approach is interesting, but I also see some limitations. It is a very simple approach with limited novelty: in essence, a nearest-neighbor method with a specific similarity measure. Estimating the conditional distribution by analyzing the neighborhood of a test instance is a well-known approach in nearest-neighbor research, so the only novelty here is the similarity measure, which is quite specific. I would not be surprised if this similarity measure outperforms Euclidean distance, because the presented similarity is computed on those features that matter for prediction; especially for high-dimensional datasets with many irrelevant features, this might be an advantage over Euclidean distance. However, in nearest-neighbor research many alternative similarity scores that also address the curse of dimensionality have been proposed, and I find it a pity that this literature is completely ignored.

In my opinion, the proposed similarity measure has at least one obvious shortcoming: it is a non-continuous function that results in many ties. I am wondering whether this similarity measure is able to outperform some simple baselines that also overcome the curse of dimensionality (e.g., select the most important features based on a variable-importance criterion and compute the Euclidean distance in the resulting lower-dimensional space). Overall, I would have liked to see more theoretical and experimental justification that the proposed similarity measure is the way to go. The fact that the variance needs to be recalibrated using a validation set lets me conclude that the considered similarity measure leads to a biased estimate of p(y|x); more theoretical insight into what goes wrong would be useful. The experiments show that the new method outperforms some baselines, but that is what 99% of NeurIPS submissions claim; such claims are hard to verify, so some theoretical results would help me believe that the proposed method is state of the art.

In the experiments, the assumptions for using a paired t-test are not met: the individual numbers are not independent, because the training datasets overlap. If you want to compute p-values, please use the right type of test (see, for example, Dietterich, "Approximate statistical tests for comparing supervised learning algorithms," and follow-up papers on that topic). NLL and CRPS are hard to interpret as measures for comparing different approaches; in my opinion, checking the validity of a predefined prediction interval is easier to interpret, and this measure is commonly used in the conformal-prediction-for-regression literature.
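To make the leaf-co-occurrence affinity discussed above concrete, here is a minimal sketch: for a test instance, count in how many trees each training instance lands in the same leaf, keep the k highest-affinity neighbours, and place a distribution around the GBRT point prediction using their targets. This is an illustration written against scikit-learn, not the paper's implementation; the function names, the Gaussian output choice, and the variance floor are assumptions.

```python
import numpy as np
from scipy import stats
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X_train, y_train = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_test, _ = make_regression(n_samples=5, n_features=10, noise=10.0, random_state=1)

gbrt = GradientBoostingRegressor(n_estimators=100, max_depth=3, random_state=0)
gbrt.fit(X_train, y_train)

# Leaf index of every sample in every tree; the reshape guards against the
# (n_samples, n_estimators, 1) shape some scikit-learn versions return.
train_leaves = gbrt.apply(X_train).reshape(len(X_train), -1)  # (n_train, n_trees)
test_leaves = gbrt.apply(X_test).reshape(len(X_test), -1)     # (n_test, n_trees)

def predictive_distribution(i, k=50):
    """Affinity = number of trees in which a training instance shares a leaf with
    test instance i; the k highest-affinity neighbours define the local spread."""
    affinity = (train_leaves == test_leaves[i]).sum(axis=1)   # (n_train,)
    neighbours = np.argsort(-affinity)[:k]                    # top-k training indices
    mu = gbrt.predict(X_test[i:i + 1])[0]                     # keep the GBRT point prediction
    sigma = max(y_train[neighbours].std(), 1e-6)              # scale from the neighbourhood
    return stats.norm(loc=mu, scale=sigma)                    # or fit any output distribution

for i in range(len(X_test)):
    dist = predictive_distribution(i)
    lo, hi = dist.interval(0.95)
    print(f"test {i}: mean={dist.mean():.1f}, 95% interval=({lo:.1f}, {hi:.1f})")
```

A validation-set calibration step, which the reviewers note the paper includes, would typically rescale `sigma` before the distribution is used.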
The authors develop IBUG, a straightforward approach for producing probabilistic predictions for gradient-boosted regression trees. The approach itself is very simple, but the point the authors are making is that this approach is useful. The authors do a very large amount of computational work showing the potential of their approach, and the code the authors provide looks clean and easy to use. The contributions of this paper are largely in engineering a good solution and then implementing that solution in an excellent way; there is no big methodological contribution.

Originality. Originality is the weakest aspect of the paper. The authors don't cite Davies & Ghahramani (2014), but the affinity score in Equation 1 is equivalent to Section 3 in Davies & Ghahramani (2014). Even putting that paper aside, this work is not terribly new or novel; nothing in it is very surprising. However, originality is often overrated in research, and the authors seem to have made a substantial engineering contribution. Although the authors may have missed Davies & Ghahramani (2014), they otherwise nicely document the literature, so with the exception of that one paper the authors aren't overclaiming originality.

Quality. The quality of the engineering is high. The authors have clearly done a lot of work in examining (i) 22 datasets, including 21 standard benchmarks; (ii) 3 performance metrics, including NLL, CRPS, and RMSE; (iii) 3 different base models, including LightGBM, XGBoost, and CatBoost; and (iv) several types of output distributions. I checked the code the authors provided, and it looks nice.

Clarity. The clarity of the paper is good: the paper is well written and easy to understand, and the code is also well written and easy to understand.

Significance. The paper has the potential to be significant because of the extensive engineering contributions. The currently anonymous code is available under an Apache 2.0 license, so my hope is that lots of others will have the opportunity to use the authors' work. As the authors write, there is a fairly big chasm between the ease of use of GBRT and the availability of probabilistic models, so closing that gap in an easy-to-use way is nice. My concerns about significance are twofold: (i) the authors only compare to other simple, easy-to-implement methods for producing probabilistic predictions on trees; on the one hand, the authors are right in saying that Bayesian models and more complex approaches like BART will not scale well and are therefore unlikely to be used very frequently, but on the other hand it would be nice to have comparisons between the authors' simplified approaches and the more intricate approaches. (ii) The authors claim better performance of IBUG, but I have some concerns about this, because with trees the goal should maybe be different from raw performance of IBUG against other approaches: retrofitting GBRT for probabilistic prediction is always going to be a bit of an art, since the probabilistic prediction is missing from the beginning and is really only added on later. Although I'm hoping the authors will address these items in the rebuttal, they're not really showstopping points; the engineering contribution here is rather extensive.

Minor: line 99, "euclidean" should be "Euclidean". This is fine.

### Summary:
This paper presents a method for extending any GBRT point predictor to produce probabilistic predictions such that the aleatoric uncertainty can be quantified. It computes a nonparametric distribution around a prediction using the k nearest neighbors, where the distance is measured by a kernel that is similar to the random-forest kernel. The paper is well written and easy to read. All of the reviewers agree that it is a simple, practical method that is well engineered, but all the techniques used in this system are existing ones, so its technical novelty is limited. During the discussion period I had more than a few communications with the reviewers. On one hand, there were some concerns about the limited novelty, which I also agree with; in fact, this concern became more notable in the discussion. On the other hand, a strength is in its simplicity, practicability, and its excellence in engineering and design: how to critically evaluate alternative approaches and how to design experiments that evaluate those approaches. A few things that I would like the authors to consider in their future submissions include: (1) the method is applied to quantify only aleatoric uncertainty, which should be clearly mentioned earlier in the paper, since these days we observe a few interesting methods for quantifying the predictive uncertainty that covers both aleatoric and epistemic uncertainty; (2) a kernel similar to the random-forest kernel is used as a distance metric, but unlike RF, GBRT constructs trees with small depth, so many instances are expected to fall in the same leaf, and the behavior might differ from the RF case. Despite the concern about limited novelty, most of the reviewers feel that this work can be accepted, so I recommend it for acceptance.
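As a generic illustration of the evaluation quantities the reviewers debate, the closed-form CRPS and NLL of a Gaussian predictive distribution, and the empirical coverage of a predefined prediction interval (the check the third reviewer prefers), can be computed as follows. This is not code from the paper; the Gaussian form and the stand-in data are assumptions.

```python
import numpy as np
from scipy import stats

def crps_gaussian(y, mu, sigma):
    # Closed form for a Gaussian forecast (Gneiting & Raftery, 2007).
    z = (y - mu) / sigma
    return sigma * (z * (2 * stats.norm.cdf(z) - 1) + 2 * stats.norm.pdf(z) - 1 / np.sqrt(np.pi))

def nll_gaussian(y, mu, sigma):
    return -stats.norm.logpdf(y, loc=mu, scale=sigma)

def interval_coverage(y, mu, sigma, level=0.90):
    half = stats.norm.ppf(0.5 + level / 2) * sigma
    return np.mean((y >= mu - half) & (y <= mu + half))

rng = np.random.default_rng(0)
y = rng.normal(size=1000)           # stand-in test targets
mu = np.zeros(1000)                 # stand-in predictive means
sigma = np.full(1000, 1.2)          # stand-in predictive standard deviations
print("mean CRPS:", crps_gaussian(y, mu, sigma).mean())
print("mean NLL :", nll_gaussian(y, mu, sigma).mean())
print("90% interval coverage:", interval_coverage(y, mu, sigma))
```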
[Tokenized columns for the example above omitted: what appear to be its input_ids, attention_mask, and labels arrays, which duplicate the review and summary text.]
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

In this paper, the authors propose a new benchmark protocol, i-Blurry, for continual learning. In this benchmark protocol, the class distribution is class-incremental with blurry task boundaries, and the training is online. They also propose a new method, CLIB. This method contains three important components: sample-importance memory, memory-only training, and adaptive LR scheduling. Extensive experimental results are provided to show the effectiveness of the proposed method.

Strengths:
- The proposed benchmark protocol, i-Blurry, is reasonable and interesting. It is important to analyse the results when some of the classes are not disjoint in continual learning tasks.
- The authors provide extensive experimental results.
- The proposed method is technically sound and is shown to be effective by the empirical results.
- The paper is well organized and easy to follow.

Weaknesses:
- The technical contributions of this paper are somewhat weak. The authors mainly propose three components in the method section: sample-importance memory, memory-only training, and adaptive LR scheduling. For the first component, the main idea is to find the optimal exemplars that minimise the total loss, so it could be regarded as a simplified version of Mnemonics (Liu et al., 2020). The second component can be summarised as the sample-importance memory combined with GDumb (Prabhu et al., 2020). For the third component, adaptive LR can benefit all classification models, and the authors do not explain why it is particularly important in continual learning tasks.
- The paper is not self-contained. Algorithm 3 and Section A.3 are not just some additional information and results: I cannot understand the proposed method without reading Algorithm 3 and Section A.3, so the authors should definitely include these parts in the main paper.
- The authors only provide experimental results on small-scale datasets. Most continual learning papers, such as iCaRL (Rebuffi et al., 2017) and BiC (Wu et al., 2019), provide results on large-scale datasets, e.g., ImageNet-1K. As the authors are trying to establish a new benchmark protocol, it is important to provide results on large-scale datasets.
- An ablation study on the number of tasks should be provided. In many continual learning papers (iCaRL, Rebuffi et al., 2017; BiC, Wu et al., 2019; etc.), the number of tasks significantly influences the continual learning performance. However, I do not find an ablation study on the number of tasks in this paper; I do not even see an explanation of how the authors choose that hyperparameter.

Overall, I think this is an interesting paper: the authors design a new benchmark for continual learning and propose a simple yet effective method. My primary concern is how likely it is that subsequent researchers will refer to this benchmark, as there are already many different benchmark protocols in continual learning; I think they will tend to choose the benchmark protocols already used in many popular papers. Nevertheless, I still think the paper might be useful for the continual learning community. My rating is borderline accept, and I will consider upgrading my rating if the authors successfully address my questions.

Post-rebuttal update: the authors addressed most of my concerns in the rebuttal, and they provided the results I asked for in the revision, e.g., the results on ImageNet-1K. I now tend to accept this submission, so I upgrade my rating to eight.

The paper proposes a new problem setup in continual learning. As the title suggests, the paper focuses on online, task-free, class-incremental, task-blurry learning with anytime inference. The authors also come up with new baselines and importance-based memory management, and they empirically test their methods in the proposed problem setup.

The paper has excellent plots for the new problem setup. However, without further clarification of the following concepts/comments, the significance of the new setup could be weak:
- The paper claims the new setup is both task-free and class-incremental. In my opinion, the two setups are not compatible with each other; specifically, they differ at the output layer in supervised learning.
- Anytime inference may require more rigorous justification, and so does the new metric. Only one baseline demonstrated the incapability of making anytime inference in Table 1; the significance of anytime inference needs more evidence.
- Theorem 1 may suffer from catastrophic forgetting. Theorem 1 can be seen as a performance-driven memory-management strategy; however, it has nothing to do with catastrophic forgetting. In Fig. 4, the proposed method shows a projected performance drop in the later phase; in the long run, it is hard to tell whether the proposed method will outperform the baselines, and this performance drop could be attributed to the catastrophic forgetting of the Theorem 1 strategy.
- The paper does not discuss the algorithm complexity, which could be O(M) in general, where M is the memory size.

There are unresolved concerns about the concepts/comments above in the current version; without addressing them, the proposed problem setup and method may not be that significant for continual learning.
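As a rough illustration of the anytime-inference metric questioned above (the area under the accuracy curve measured while the data stream is consumed), a short sketch follows. The evaluation schedule, the normalization, and the toy learning curve are assumptions rather than the paper's exact protocol.

```python
import numpy as np

def anytime_auc(accuracy_curve, samples_seen):
    """Area under the accuracy-vs-samples-seen curve, normalized to an average accuracy."""
    acc = np.asarray(accuracy_curve, dtype=float)
    n = np.asarray(samples_seen, dtype=float)
    area = np.sum((acc[1:] + acc[:-1]) / 2.0 * np.diff(n))  # trapezoidal rule
    return area / (n[-1] - n[0])

samples_seen = np.arange(100, 2100, 100)           # evaluate every 100 stream samples
accuracy_curve = 1 - np.exp(-samples_seen / 800)   # stand-in learning curve
print("A_auc ~", round(anytime_auc(accuracy_curve, samples_seen), 3))
```

A method that only produces good predictions near task boundaries would trace a lower curve, which is what such a metric is meant to expose.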
the appendix that further corroborate most claims 4 good organization of related work and great figures weaknesses 1 i am unsure if the proposed approach of obtaining sample importance is entirely novel a discussion regarding this is necessary as it is likely that similar ideas present beyond cl for instance in literature 2 a natural language description of theorem 1 is missing 3 i felt that more insights into the calculation of importance are required in the main text a note about the efficiencyruntime needs to be specified personally i think dedicating a portion of the maintext section 4 in discussing these would probably be more beneficial than describing memory only training and adaptive lr as they seemingly play only a minor role as indicated by the ablations 4 no information as to why tinyimagenet is challenging for all methods is provided there is also no justification provided for choosing tinyimagenet over imagenet rm bic and gdumb all show their results on imagenet 5 since the iblurry setting is an online one the choice of hyperparameters are likely to affect the results greatly as such hyperparameter choices updates per sample memory size and batchsize need to justified specifically were the hyperparameters chosen consistent with previous work 6 for a fixed m is it reasonable to expect that performance of all algorithms would increase in general with increasing n this can be used to justify choosing n to be from 0 50 100 would also be helpful reference for future works suggestion not a weakness the paper can also report the performance on some soft upper bounds for instance what if the task was not restricted to be online what is the performance when the entire training data is available at once non cl scenario there are some typos section 42 page 5 arguing that considering lead to training efficacy decrease section 41 page 5 for the lr scheduling other cl methods use either 1 exponential decay or 2 constant lr last line on page 4 is missing a period fullstop also table 3 aavg on cifar100 has the same numbers for the last two rows which might be a typo basis for the scores 1 the experiments justify most of the information present in the paper some additional information regarding the sample importance based memory management scheme is required also some decisions regarding hyperparameters need to justified given the iblurry setting is online and the main motivation of the paper is to propose and solve a more practical cl setting 2 im not sure about the novelty of the sampleimportance based memory management scheme and a discussion regarding prior work related to sampleimportance is required the proposed setting on iblurry is marginally novel as it is a generalization of the disjoint and blurry configurations anytime inference and aauc are likely novel 3 empirical results shown cannot be easily and extensively compared to some prior work as there isnt a strict adherence to prior dataset and hyperparameter choices some justification for the decisions is required 4 insights into the proposed clib method along with a nuanced discussion regarding the differences between the algorithms is lacking there is no description regarding efficiency and scalability update the authors clarify almost all of my questions pertaining to the paper the paper now contains experiments on imagenet1k with additional insights into the algorithm and its efficiency as such ive improved my scores docsepthe paper presents an experience replay method for continual learning cl whose main innovation is a 
docsep

The paper presents an experience replay method for continual learning (CL) whose main innovation is a memory-management algorithm that updates the memory when new samples arrive in a way that preserves the most useful samples, with respect to their effect on the loss, while promoting class balance. The method is evaluated on the trickiest CL setting (class-incremental, online, and task-free), where it outperforms baselines on the CIFAR-10 and CIFAR-100 datasets.

The main contribution of the paper, I believe, is the memory update method, which, as stated above, updates the memory when new samples arrive in a way that preserves the most useful samples w.r.t. their effect on the loss while promoting class balance. The authors also train exclusively on samples drawn from the memory rather than on batches split between memory and newly arrived samples, which seems to be advantageous, and they use a learning-rate scheduling method that also gives some advantages. The memory-management method is novel as far as I'm aware, and the overall method performs very well. Evaluation is carried out in the class-incremental, online, task-free setting, which is a good decision in my opinion; so many other methods fall short under these conditions. So I think the paper deserves to be accepted.

The authors make a big deal out of their blurry sampling method for evaluation. I think it's a good approach but isn't really a big contribution in its own right. Presumably, following a memory update, the batch that is trained on is sampled uniformly from the memory; is that right? Space permitting, it would be nice to see a more detailed comparison with related experience replay methods; in particular, Aljundi et al.'s MIR method also uses a kind of importance weighting. It would be good to see the differences spelled out and to see some sort of explanation for why the present method performs better. A nice paper describing an effective and somewhat novel replay-based CL method evaluated in a demanding setting.

### Summary:
The authors propose a new continual-learning setting with a few distinguishing features: (1) the task boundaries are blurry, in other words past-task samples can reappear; (2) training is online; and (3) evaluation uses online accuracy instead of average accuracy. The authors also propose a useful method for this scenario and benchmark it using four different datasets.

The first round of review pointed to two main limitations of the manuscript:
- The authors only provided small-scale experiments. The reviewers argued that, for the setup and method to have an impact, having good results on larger-scale data would go a long way.
- Whether task-free and class-incremental were compatible.

For the former, the authors were very reactive and provided results using a standard ImageNet-for-CL dataset. For the latter, I must thank the authors and also the reviewers for discussing this thoroughly. In the end, my understanding is that there was a reconciliation that both were in fact compatible, but the reviewer suggested that this be discussed very clearly by the authors; I second this suggestion. The CL field, given its many slightly different settings, might be partly to blame here. Reviewer vfw2 made a similar comment, and I also thank them for playing a role in resolving the issue.

A few additional thoughts:
- I believe that more general setups in CL are worthwhile even in the absence of any immediate applications. This is especially true since some of the standard CL assumptions do not seem to be well motivated. However, I find that claiming that something is more realistic requires grounding, e.g., a set of examples from the real world or a specific domain/setting. I know the authors backed some of their claims with references, but different real-world problems will come with different limitations, and I would be hesitant to use phrases such as "most real-world settings" without thorough justification.
- While different from the core of your work, I believe the framework proposed in this other recent paper has similar goals (although the setup allows pretraining and is not online); it might be worth knowing about it in case you do not already: "Online Fast Adaptation and Knowledge Accumulation (OSAKA): A New Approach to Continual Learning", NeurIPS 2020, https://papers.nips.cc/paper/2020/file/c0a271bc0ecb776a094786474322cb82-Paper.pdf

All in all, this is a good contribution that proposes an interesting and rich setting along with a good baseline method for it. I strongly encourage the authors to follow through on their promise to provide the community with code, dataset splits, a Kaggle leaderboard, etc., as a way to maximize the impact of their work.
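The reviews and the summary above both refer to evaluating anytime inference through the area under the online accuracy curve (called A_AUC in the review) rather than final average accuracy. A minimal sketch of such a metric, assuming accuracy is probed at a fixed set of checkpoints along the stream and the area is normalized by the total number of samples seen, could look as follows; the exact probing schedule and normalization in the paper may differ.

```python
def anytime_auc(accuracies, samples_seen):
    """Trapezoidal area under the accuracy-vs-#samples curve, normalized by the
    total number of stream samples so the score lies in [0, 1].
    `accuracies[i]` is the evaluation accuracy after `samples_seen[i]` samples.
    Illustrative only; not the paper's exact definition."""
    assert len(accuracies) == len(samples_seen) >= 2
    area = 0.0
    for i in range(1, len(accuracies)):
        width = samples_seen[i] - samples_seen[i - 1]
        area += width * (accuracies[i] + accuracies[i - 1]) / 2.0
    return area / samples_seen[-1]

# Example: a learner evaluated every 1000 stream samples.
acc = [0.10, 0.35, 0.50, 0.58, 0.62]
seen = [0, 1000, 2000, 3000, 4000]
print(anytime_auc(acc, seen))  # ~0.45
```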
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
The paper considers learning Boolean functions represented by read-once DNFs by using neural networks. The neural network architecture consists of a hidden layer with 2^d components, which is rich enough to express any Boolean function. Given the whole set of 2^d instances of some read-once DNF, the authors showed that (1) the weight setting corresponding to the true DNF is the global minimum of the loss-minimization problem with the network, (2) they empirically observe that gradient descent with a rounding heuristic finds the true DNF expression, and (3) the solution of a 2-norm minimization recovers the true DNF.

The assumption that the whole set of instances of the true read-once DNF is given is too strong. It would be much nicer if, given a partial set of instances S ⊆ X, one could learn a consistent DNF by using neural networks; then previous PAC-learning results for read-once DNFs could be applied to obtain sample-complexity results.

I don't think the computer-aided proof is really a proof, so I am afraid that it should not be stated as a theorem.

As a summary, I feel that the technical results are still preliminary and not mature enough to be published.

docsep

In this paper the aim is to understand the inductive bias of neural networks learning DNFs. The focus is on convex neural networks and gradient descent. It is shown that, under a symmetric initialization, the global minimum that gradient descent converges to is similar to a DNF-recovery solution. Further experimental evaluation demonstrates that gradient descent can recover read-once DNFs from data.

Learning functions over Boolean variables is a fundamental problem, and there has been increasing interest in using neural networks for this task. This paper sheds light on this task in a very specific case. I am not an expert on the area of the paper and hence cannot fully assess the novelty and impact of the work. However, I found the development in the paper concise, and enough details are provided for even a non-expert to follow the presentation. The restriction to read-once DNFs seems rather severe; however, I found the analysis of the inductive bias of gradient descent towards logical formulas interesting.

Pros:
1. Provides understanding of the inductive bias of convex neural networks using gradient descent.
2. A computer-assisted proof and experiments are used to complement the theoretical results.
3. Clearly and concisely written paper; a significant amount of supplementary material makes a more detailed treatment available for those interested.

Cons:
1. The results cover read-once DNFs, which is a very restricted class of DNFs, which might limit the impact of the results.
2. Without access to code, it is impossible to assess the correctness and quality of the computer-assisted proof.

Questions: it is quite unclear what is happening in Figure 2; could you add some more explanations? Will the code related to the computer-assisted proof and the other experiments be made publicly available?

Minor comments: Remark 3.1: "Boolean", not "boolean". There are a couple of references to "we use a unique property of our setting that allows us to perform calculations in integers and avoid floating point errors" (intro, start of Sec. 6) where it is unclear what this property is; this is finally explained before Def. 6.3. It could be better to explain this briefly at the earlier mentions, because in their current form they do not really help the reader but rather leave them wondering what this property is. KKT (page 7) is not defined.
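The first review above criticizes the assumption that all 2^d labeled instances of the target read-once DNF are available. To make that assumption concrete, the following sketch enumerates the full Boolean cube for a toy monotone read-once DNF; the particular formula, the ±1 labels (a common choice with hinge-type losses), and the encoding are illustrative assumptions, not taken from the paper.

```python
from itertools import product

# Toy monotone read-once DNF over d = 6 variables:
# (x0 AND x1) OR (x2 AND x3 AND x4) OR x5.
# "Read-once" means every variable appears in at most one term.
TERMS = [(0, 1), (2, 3, 4), (5,)]
D = 6

def dnf_label(x):
    return 1 if any(all(x[i] for i in term) for term in TERMS) else -1

# The full instance space assumed in the paper's setting: all 2^d assignments.
dataset = [(x, dnf_label(x)) for x in product((0, 1), repeat=D)]
print(len(dataset))                      # 64 = 2^6
print(sum(y == 1 for _, y in dataset))   # number of positive assignments
```

This also illustrates why the setting is considered strong: the dataset size grows as 2^d, so the "whole instance space" assumption quickly becomes unrealistic as d grows.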
docsep

This paper investigates the problem of learning monotone read-once DNF formulas using convex neural networks. Specifically, the authors explore the distribution-specific PAC setting where training samples are drawn independently according to the uniform distribution and are labeled according to a target monotone read-once DNF. The main contribution of this study is essentially empirical: convex neural nets trained with GD to minimize the cumulative hinge loss converge to global minima for which the neural units coincide with the monomials of the target DNF. This remarkable stability is corroborated by theoretical insights about global minima.

First of all, the formal setting should be clarified. According to the specifications given in Section 3, this study focuses on monotone read-once DNF formulas, for which all literals are positive. I don't think that this restriction has a major impact on the result, since read-once DNFs are unate, i.e., we can rename negative literals in order to obtain a monotone variant.

Next, the learnability result about read-once DNF formulas should be clarified. In the introduction it is indicated that this concept class is efficiently PAC learnable under the uniform distribution, quoting Fiat & Pechyony (2004). Well, this is not exactly true: Fiat & Pechyony (2004) used a result obtained by Mansour & Schain (2001), in which it was shown that read-once DNF formulas are properly and efficiently PAC learnable under any maximum-entropy distribution, but in FP (2004) it was not explicitly demonstrated that any read-once DNF formula is properly and efficiently PAC learnable under the uniform distribution. So, for the sake of completeness, it would be legitimate to provide such a result, in the appendix for example.

Finally, and most importantly, I am not entirely convinced by the impact of this study. On the one hand, as indicated above, the authors empirically demonstrate that convex neural nets are able to learn monotone read-once DNF concepts by converging to global minima that capture the target concept. This is indeed interesting in practice, but there is no formal proof that convex neural nets are able to learn any monotone read-once DNF in polynomial time with polynomial sample complexity. On the other hand, it is already known that monotone read-once DNF functions are improperly but efficiently PAC learnable under the uniform distribution (Hancock & Mansour, 1991). In fact, Hancock and Mansour have shown that monotone read-k DNF functions are efficiently PAC learnable under product distributions. Actually, many subclasses of DNF are known to be efficiently learnable under product distributions using spectral approaches (see, e.g., Feldman, 2012). So, in light of such strong results, it seems that the contribution of the present paper is slightly behind.

References:
- Vitaly Feldman. Learning DNF expressions from Fourier spectrum. In Proc. Conf. Learn. Theory (COLT), pages 17.1–17.19, 2012.
- Thomas Hancock and Yishay Mansour. Learning monotone k-DNF formulas on product distributions. In Proceedings of the 4th Annual Conference on Computational Learning Theory (COLT), pages 179–193, 1991.
- Yishay Mansour and Mariano Schain. Learning with maximum-entropy distributions. Machine Learning, 45(2):123–145, 2001.
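The third review contrasts the full-truth-table assumption with the distribution-specific PAC setting, where examples are drawn i.i.d. from the uniform distribution over {0,1}^d and labeled by the target monotone read-once DNF. A small sketch of that data-generating process, with a made-up target formula and sample size chosen only for illustration, might look as follows.

```python
import random

def monotone_read_once_dnf(x, terms=((0, 1), (2, 3, 4), (5,))):
    # Each variable index appears in at most one term (the read-once property).
    return 1 if any(all(x[i] for i in t) for t in terms) else -1

def uniform_samples(m, d=6, seed=0):
    """m i.i.d. examples with x ~ Uniform({0,1}^d), labeled by the target DNF.
    Illustrative of the distribution-specific PAC setting, not the paper's code."""
    rng = random.Random(seed)
    xs = [tuple(rng.randint(0, 1) for _ in range(d)) for _ in range(m)]
    return [(x, monotone_read_once_dnf(x)) for x in xs]

train = uniform_samples(200)
print(sum(y == 1 for _, y in train) / len(train))  # empirical positive rate
```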
### Summary:
This paper studies how two-layer neural networks can learn DNFs. The paper provides some theoretical analysis together with empirical evidence. The direction of analyzing how neural networks learn certain concept classes is definitely extremely important, and the authors do make some progress in this direction. However, there are some major concerns about the paper.

In the main result, the authors seem to be able to prove only that the learning process converges with exponentially many neurons (exponential in the input dimension; see the Section 6.1 setup). With this many neurons, it is unclear whether the result is directly covered by existing works such as (a) "Learning and Generalization in Overparameterized Neural Networks, Going Beyond Two Layers" and (b) "Fine-Grained Analysis of Optimization and Generalization for Overparameterized Two-Layer Neural Networks". These two works provide efficient optimization and generalization bounds with respect to the complexity of the function. However, these works are still in the NTK regime; it would be nice if the authors could distinguish the current technique from NTKs by providing some theoretical guarantee that their main result is indeed more efficient than kernels, as they argue in the introduction. The authors can refer to (a) "What Can ResNet Learn Efficiently, Going Beyond Kernels?" and (b) "When Do Neural Networks Outperform Kernel Methods?". With the current form of the draft, it is unclear how the result is better than existing approaches; the authors should address that in the next version of the paper.

Missing reference on NTKs: "A Convergence Theory for Deep Learning via Over-Parameterization".
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors propose a method for fast and efficient classification of sequential data the guiding principle is that for some data modalities it is not necessary to see the whole sequence in order to make a fairly certain classification their model reduces inference time by learning a rank code that is inspired by spiking neural networks reported results show improved inference times in two toy sequence classification tasks temporal mnist and in google speech commands classification compared to models without optimizing timing of inference through learning a rank code increasing inference speed comes with a minimal decrease in accuracy the authors however introduce and show the effectiveness of a regularization term that allows for tuning of this speedaccuracy tradeoff pros i think the proposed method is very practical and the ideas of this paper are organized logically the method is related to earlyexit inference but their model not just exits early it does so by a learned rank code rc that is inspired by spiking neural networks this allows for adaptively decreasing computation and increasing speed during both training and inference for training the idea of backpropagating from a strategically early time step of each sequence determined by the time at which an output neurons activation crosses a threshold is interesting and can lead to significantly shorter training time and lower compute resources it appears that prior and related work is adequately referenced throughout the paper unfortunately i am not familiar enough with this line of work but i quick search revealed no glaring omission the empirical methodology appears standard and is reported in sufficient detail to recreate the results the resulting claims are justified by the performance of the method strengths and limitations are sufficiently discussed the writing is clear and succinct cons the authors do not compare their method to nonrc trained lstms in the temporal mnist task for this task since it appears that the first frame contains most of the information i suspect that an earlyexit but nonrc trained lstm would perform similar in terms of inference speed and accuracy this is however just a guess did the authors try this or have further insights comparing their model to snns in this task seems somehow unfair also given that the authors state earlier that anns are simpler to train and usually achieve superior performance however i can follow the authors argumentation on why they have chosen to compare to snns comparing to earlyexit but nonrc trained lstms would still be interesting but omitting it would not weaken the message delivered in the overall strong paper the threshold is a fixed hyperparameter of the model would it be beneficial to learn this parameter overall i vote for accepting i like the idea of integrating specific isolated aspects of biological neurons into otherwise conventional anns here the authors show that anns can benefit by such an approach and i think that further exploration of such methods may advance both spiking neural network and conventional deep learning research docsepthe authors introduce a new way to train rnns using rank order coding roc with roc the label is given by the first readout unit to reach a threshold as soon as this happens the processing is stopped and bptt is used from that particular time step using the predictions at that particular time step and the ground truth 
this will encourage the neuron with the right label to be as active as possible at that particular time step and thus its threshold will tend to be reached earlier in the future this is desirable as the latency of the decision will decrease furthermore the speedaccuracy tradeoff is tunable by varying the threshold the authors validate their idea using lstms on two toy problems and then on mnist and on the google speech command dataset strength as far as i know the idea is new well written weaknesses experimental validations are well below modern ml standards the authors should test their idea on spiking neurons eg lif as opposed to lstms i think experimental validation falls short for iclr only the google speech command dataset is not toy and the accuracy they get on this dataset is below the sota in addition the authors seem to target the spiking neural network community part of which uses roc see for example table 1 but then for a fair comparison they should try to use a spiking neuron model eg leaky integrate and fire lif instead of the lstm in discretetime the lif can be seen as a recurrent ann unit and bptt could be used as well the authors seem to use batch processing but i dont understand how its possible the number of timesteps used by bptt is exampledependent how can batch processing work with a varying number of timesteps more insight is needed here a potentially interesting idea but not yet validated docsepthis paper presents the original idea of applying rank coding to classical lstm networks in order to improve their performance the paper is very clearly written and has an extensive introduction that allows to introduce the scientific problem the method is briefly described and then applied in experimental demonstrations if the first example is particularly simple the three following applications are more challenging and show the advantage of using this method compared to the state of the art one a particularly interesting point is the tradeoff curves between speed and accuracy as well as the performance of the network depending on the use or not of rank coding i would like to point out some limitations of this work and how it could be improved first of all it seems that the rank coding used in this paper is radically different from the one proposed by thorpe and gautrais indeed in the latter an analog vector is transformed into a vector in which its analog values are ordered from the highest to the lowest this representation has many properties such as being invariant to continuous monotonic transformations of the analog values as for example an image can be transformed by a change of its contrast this transformation also keeps a very high complexity of possible representations which corresponds to the set of permutations of all ranks that is to say to the factorial of the dimension of this vector however in this paper it seems that you only consider the maximum of the analog vector when it exceeds a threshold this indeed allows to transform the calculation into a temporal calculation but the complexity of this operation is much lower than the original rank coding in the first practical example it seems that you use this coding in different time windows that correspond to different bins in a sequence and that these sequences are then processed independently this information processing is far too simplified to qualify as temporal coding and thus to correspond to a spiking neuron model the other experiments seem to show a clear advantage to the rank coding and seem promising to 
extend it to the coding of the maximum but also of the successive values i encourage the authors to apply this kind of method to larger images as it has already been done in the literature ### Summary:
the authors propose a rank coding scheme for recurrent neural networks rnns inspired by spiking neural networks in order to improve inference times at the classification of sequential data the basic idea is to train the rnn to classify the sequence early even before the full sequence has been observed they also introduce a regularisation term that allows for a speedaccuracy tradeoff the method is tested on two toytasks as well as on temporal mnist and google speech commands the results are very good typically improving inference time with very little loss in accuracy furthermore the idea seems novel and the paper is well written an initial criticism was that experiments with spiking neural networks snns were missing the authors added a proof of concept for snns which satisfied the reviewer the authors also added some control experiments in response to the initial reviews which improved the manuscript in summary the manuscript presents a valuable novel idea with good experimental verification and interesting aspects both for anns and snns the reviewers consistently vote for acceptance
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this work targets the issues in learning from demonstration in order to achieve the ability of skill reuse transferability and improve the sample efficiency to address the issues the authors propose an optioncontrol network ocn including a highlevel controller and a pool of lowlevel options which is an imitationfinetune paradigm as well finally the proposed method is evaluated on two domains craft discrete action space and dial continuous action space with results demonstrating its effectiveness strengths although this work extends the paradigm of imitationfinetune its idea is still novel by freezing the options and finetuning the controller the topic is timely important which is related to transfer learning and multitasking the paper is wellorganized and its method is sound to me besides the related work is quite clear and includes the relevant literature weakness there lacks the clarity for example why positive rewards at the beginning of training is a problem in hrl what are structured exploration and unstructured exploration during the execution of an option if probability eit is 0 the option will keep executing what if the task is really hard and eit keeps 0 will this step into an infinite loop according to figure 1 figure 3e and table 1 does that mean each option corresponds to a specific task if this is the case there lacks analysis or discussion on the scalability and generalizability this work primarily offers an empirical contribution however the empirical evaluation is still weak regarding the transferability demonstrated in the experiment the newly task is a similar task in the same domain which is still limited minor typo they encoder skills they encode skills i prefer to weak reject marginally below the acceptance threshold the paper the clarity is an issue that makes it a little bit hard to follow the empirical evaluation is weak so that the transferability looks limited the scalability and generalizability should be discussed in the paper as well docsepthis paper proposes to extract options from a dataset using an options framework with a recurrent controller and multiple recurrent option networks during pretraining ie skill extraction the actions from the framework are computed by the weighted sum over options which can be endtoend differentiable and trained using behavioral cloning once all options are extracted from the dataset a new controller can be trained to solve a new task with the learned and frozen options via rl the experimental results on the discrete craft environment are impressive outperforming all baselines the results on the continuous dial environment show improved results over the baseline methods but the difference is marginal strengths the proposed approach is intuitive and easy to implement the paper is easy to follow and clearly written the experimental results in craft show significant improvement over prior works weaknesses comparison to prior latent skill learning approaches is required to understand the contribution of the proposed option extraction method these approaches already have shown impressive results on complex robotic manipulation environments it is unclear whether the proposed method can achieve better performance compared to these approaches pertsch et al accelerating reinforcement learning with learned skill priors corl 2020 shankar et al learning robot skills with temporal variational inference icml 2020 in the approach 
section ocn can be easily expanded to a multilevel model does not seem true most hierarchical approaches so far have failed to learn more than twolevel hierarchy which implies challenges in extending to multilevel hierarchy when an option is continued the hidden state of the controller is copied without any update this results in missing taskspecific information during the option execution given the fact that the proposed method relies heavily on recurrent models this missing temporal information could make the agent decision not optimal there are discrepancies between pretraining and training ocn models where the pretraining stage computes an action and termination signal from all options with soft weighting while the rl training stage chooses a single option at a time will this affect rl training badly the options learn to be mixed together during skill learning without clear skill decomposition but when using these options they need to work alone and have a clear skill boundary in the dial task the proposed method works better than the baseline methods however the variance in figure 6 seems very high and the performance does not look significant how many random seeds are used for the experiments it definitely requires more random seeds to reach a meaningful conclusion another question from the learning curves is about the discrepancy between dense and sparse reward settings ocn seems to succeed 2030 in the sparse reward setting but in the dense reward setting its not even close to solving 2 subtasks out of 3 subtasks moreover this poor performance makes the scalability of the proposed method less convincing can ocn even learn the tasks it saw during pretraining such as 1 2 in the dial task the experiments focus on generalizing to new tasks with a longer sequence of subtasks one interesting experiment would be learning skills from a large number of tasks eg three digits pressing tasks with all numbers and then learning a controller for a holdout task minor comments optioncontrol network is used instead of optioncontroller network in the introduction option framework options framework furthered combined further combined in equation 13 lhs should be hatpat not hatpai t what happen if the chosen option outputs done this paper tackles an important problem of reusable skill option extraction from a large multitask dataset the proposed method is simple and works well in the discrete task craft however the training process is not intuitive soft selection during pretraining and hard selection during rl the experimental results on the continuous domain dial are noisy and show only marginal improvement over prior work moreover comparisons to continuous latent skill approaches are required to show the technical contribution of the proposed method docsepthe paper proposes a specific neural network architecture for option learning with recurrency on both option and highlevel controller the final action outputs are determined from a mixture of experts of all options the approach learns options offline from demonstrations behavioural cloning and combines them online by learning a new highlevel controller via rl ppo while using frozen options the submission uses variations of 2 domains from prior work for evaluation and shows good performance the overall ideas of task decomposition and using information from demonstrations to accelerate rl are important and this paper proposes a new architecture to address these ideas it is overall well written and easy to follow however the particular network is quite 
similar to existing implementations and has contributions focused on an aspect that could fall under implementation details in other papers a further big challenge is the limited evaluation and a missing connection to much of related work in its current form the submission is well worth discussion but fits more in the context of a workshop regarding the evaluation it is unclear why the authors do not compare to using options from known approaches like ddo cited in paper option critic 1 or ho2 2 where the latter two can naturally use bc pretrained options as it just requires a change of the initial parameters the evaluation uses variations of two domains which are not compared against methods from the papers which propose them in addition one of the baselines ompn uses an adaptation of the craft domain in the paper but shows a very different level of performance which suggests that the variation of the domain is different for the submitted paper the original reason for investigating the papers was that all baselines flatline at surprisingly low performance most prominent in the craft domain on the more positive side section 41 includes an example from transferring options from multiple previous experiments which provides an interesting perspective a future iteration could benefit from more focus on this generally underinvestigated idea but the results for baselines are again surprisingly low minor section 1 the mentioned exploration problem is more a property of specific domains rather than algorithms high level controller is updated less frequently is likely supposed to mean acting updating can be confused with learning each option does not correspond to a meaningful subtask it is unclear what this is supposed to mean what is a meaningful subtask and why is it good for options to not correspond to one 1 bacon pierreluc jean harb and doina precup the optioncritic architecture proceedings of the aaai conference on artificial intelligence vol 31 no 1 2017 2 wulfmeier markus et al dataefficient hindsight offpolicy option learning international conference on machine learning pmlr 2021 submission proposes a specific architecture for option learning an aspect that in other work might fall under implementation details the work has some interesting experiments on combining options from multiple experiments but the evaluation and connection to previous work is very limited docsepthe paper proposes a new model called optioncontroller network with requisite inductive biases to model temporal hierarchy this model is used to learn temporal abstractions and the control policy in the space of options using demonstration data via imitation learning the performance on discrete action craft and continuous action dial environments are promising compared to existing baselines such as ompn compile and moe strengths the model proposed is interesting and builds on the recently proposed ompn model as well as prior work 1 2 3 4 5 on incorporating inductive biases in rnns to model temporal hierarchy baseline models compared against compile ompn moe are strong appropriate and relevant all models have been evaluated on the benchmark datasets such as craft and dial as in prior work the pretrained options learned by ocn during the imitation phase lead to good performance gains when being reused later by the controller network implementation details for all models including baselines have been described well in the appendix weaknesses assuming i understood the generation of training datasets in experiments s1 and s2 
correctly the training datasets ac cd da or ab ba or cd dc generated for the imitation learning phase in s1 s2 are overly simplistic the associated decomposition problem involves just finding 1 boundary point to demarcate 2 skill primitives in an episode in the datasets for s1 and s2 experiments it would be unreasonable to assume this training setup would directly transfer to any generalcomplex realworld offline datasets containing longer sequences involving execution of many options have i missed something in my interpretation could the authors clarify this point andor provide some justification as to why they would expect their model to show similar performance in the more general case it would be beneficial to compare and discuss the ocn model in the context of prior work on rnn variants that model temporal hierarchy clockwork rnn1 hmrnn2 using various adaptations to hiddenstate updates specifically the hmrnn update rule uses similar operators such as copy update and flush to allow for slower updates to memory cells deeper in the hierarchy therefore it is important to delineate the differences in the update rules employed in hmrnn 2 and the proposed ocn model to evaluate the noveltyoriginality of the contribution further earlier work in literature on inducing hierarchical temporal structure in rnns 3 4 5 6 has not been referenced and compareddiscussed in the related work can the hierarchical update rule control flow between controller and option networks proposed in the ocn model not be implementedcomputed by an appropriately sized ompn model or perhaps a hmrnn 2 model results i found it rather surprising to see that the ompn seems to struggle to learn good decompositions of primitive skills in s1 which can be composed reused by the controller later several experiments show compile and moe baselines outperforming the ompn model quite significantly figure 4 and figure 6 this is very surprising to me since ompn essentially has a strong inductive bias for modelling temporal hierarchy which is clearly beneficial for these tasks and very similar to the proposed ocn model could the authors offer an explanation for why this is so it would be interesting to quantify how redundant ie number of options learned for the same underlying skill the learned options are and amount of reuse by the controller as we use more noisiercomplex demonstration datasets for the imitation learning phase the authors claim that ocn does not require specifying the exact number of segments in an episode and only need to specify a safe upper bound on it what do the extra slots capturemodel are they left unused or does it lead to a slightly oversegmented decomposition of the episode writingpresentation in general the paper contains several typos and ambiguous or unclear phrasing in several sections the papers readability and clarity would greatly benefit from a significant revision in this regard i have highlighted some of the typosinconsistent math notationgrammatical errors below model each skill with separate options does this imply that there are several options used to model a single latent skill is a hallmark in human intelligence is a hallmark of human intelligence further the authors do not provide a reference to validate the claim in this sentence the authors introduce it is unclear whether this is a new problem introduced by the authors or an interpretation of an existing problem it would help to cite relevant references that definestudy this exploration problem in pure hrl methods limits the practical values 
of these approaches limits the practical utility of these approaches popularity of neural nets popularity of neural networks however assuming access to an environment in the pertain phase might be infeasible in many tasks missing citation to relevant environments and performs imitation learning and perform imitation learning our work focused on our work focuses on performs irl on the demonstration performs irl on the demonstration data what does the abbreviation irl stand for extracts meaning segments extracts meaningful segments temporal hierarchical structure hierarchical temporal structure following the fast and slow learning idea proposed in madan et al 2021 please improve the citation some suggestions for the same 6 7 some of the math notation in the methodology section is confusing difficult unintuitive to follow for the reader and also not consistent throughout the text using boldface for functions ie rnn updates example oi xt ht1 c xt ht1 is confusing as a small letters with bold font is being used to denote vectors as well typo in equation 3 should it be hatht1 and not ht1 in equations 3456 several vectors with no bold font it is also very confusing to have both ht and hatht in the cell network equations further the indexing notation i t for activations of option models ex hi to is also very confusing and reduces the clarity and readability please use another indexing scheme for activations of option models like htoi or something similar it is also incredibly hard to mentally keep track of 3 variants of each activation vector for ex hatp p and p throughout the description of the algorithm and their roles please use a more intuitive alternative to make the reading experience better and improve the clarity the objective function to minimise equation 18 is written in a pseudocodelike manner please remain consistent and define the objective function using a mathematical expression to avoid ambiguity 1 koutnik et al a clockwork rnn icml 2014 2 chung et al hierarchical multiscale recurrent neural networks iclr 2017 3 schmidhuber et al learning complex extended sequences using the principle of history compression neural computation 1992 4 mozer et al induction of multiscale temporal structure nips 1992 5 el hihi et al hierarchical recurrent neural networks for longterm dependencies nips 1995 6 jaynes edwin t 1957 information theory and statistical mechanics ii physical review 108 2171 7 boltzmann ludwig studien ber das gleichgewicht der lebendigen kraft zwischen bewegten materiellen punkten wiener berichte 58 517560 1868 the proposed ocn model is an interesting extension to the recent ompn however my main issue is that the demonstration dataset generation process has drastically reduced the full sequence decomposition task to an overly simplistic case the writing can be improved significantly as suggested above to improve clarity of the proposed ideas further some closely related prior work 2 on modelling temporal hierarchy in rnns using similar ideas have not been compareddiscussed and many others not referenced 1 3 4 5 ### Summary:
description of paper content the paper describes a technique to learn option policies using behavioral cloning and then recombine them using a highlevel controller trained by rl the underlying options are frozen the method is tested in two published environments a discrete grid world environment and a continuous action space robot it is compared to three baselines summary of paper discussion all reviewers moved to reject based on a lack of novelty and a lack of significant empirical results no rebuttals were provided
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: the paper describes a vaebased approach to semisupervised learning of dependency parsing the encoder in the vae is a neural edgefactored parser allowing inference using eisners dynamic programming algorithms the decoder generates sentences lefttoright at each point conditioning on headmodifier dependencies specified by the tree a key technical step is to develop a method for differentiable samplingparsing using a modification of the dynamic program and the gumbelmax trick i thought this was an excellent paper very clear an important problem a very useful set of techniques and results i would strongly recommend acceptance some comments i do wonder how well this approach would work with orders of magnitude more unlabeled data the amount of unlabeled data used is quite small similarly i wonder how well the approach works as the amount of unlabeled data is decreased or increased for that matter it should be possible to provide graphs showing this are there natural generalizations to multilingual data for example settings where supervised data is only available for languages other than the language of interest it would be interesting to see an analysis of accuracy improvements on different dependency labels the root case is in some sense just one of the labels nsubj dobj prep etc that could be analyzed i wonder also if this method would be particularly helpful in domain transfer for example from wall street journal text to wikipedia or web data in general the improvements could be more dramatic in this case that kind of effect has been seen with elmo for example docsep summary this paper proposes to do semisupervised learning via a generative model of an arcfactored dependency parser by using amortized variational inference the parse tree is the latent variable the parser is the encoder that maps a sentence to a distribution over parse trees and the decoder is a generative model that maps a parse tree to a distribution over sentences pros semisupervised learning for dependency parsing is both important and difficult and this paper presents a novel approach using variational autoencoders and the semisupervised learning method in this paper gives a small but nonzero improvement over a reasonably strong baseline cons 1 my main concern with this paper currently is the explanations provided in the paper which are quite handwavy eg the authors state that using a kl term in semisupervised learning is exactly opposite to the low density separation assumption and therefore they set the kl term to be zero one has to wonder why the low density separation assumption is so critical for dependency parsing only vaes have been used with a prior for semisupervised learning before why didnt this assumption affect those models a better explanation would have been that since the authors first trained the parser in a supervised fashion their inference network already represents a good distribution over parses even though this distribution is specified only up to sampling but not in a mathematically closed form finally setting the kl divergence between the posterior of the inference network and the prior to be zero is the same as dynamically specifying the prior to be the same as the inference networks distribution 2 a number of important details are missing in the submitted version of the paper which the authors addressed in their reply to my public comment 3 the current paper does not contain any comparison to selftraining which is a natural baseline for this work the authors replied to my comment saying that selftraining requires a number of heuristics but its not clear to me how much more difficult these heuristics can be than the tuning required for training their vae docsep this paper proposed a variational autoencoderbased method for semisupervised dependency parsing given an input sentence s an lstmbased encoder generates a sentence embedding z and a nn of kiperwasser goldberg 2016 generates a dependency structure t gradients over the tree encoder are approximated by 1 adding a perturbation matrix over the weight matrix and 2 relaxing the dynamic programmingbased parsing algorithm to a differentiable format the decoder combines standard lstm and graph convolutional network to generate the input sentence from z and t the authors evaluated the proposed method on three languages using 10 of the original training data as labeled and the rest as unlabeled data pros 1 i like the idea of this sentencetreesentence autoencoder for semisupervised parsing the authors proposed a novel and nice way to tackle key challenges in gradient computation vae involves marginalization over all possible dependency trees which is computationally infeasible and the proposed method used a gumbelmax trick to approximate it the tree inference procedure involves nondifferentiable structured prediction and the authors used a peakedsoftmax method to address the issue the whole model is fully differentiable and can thus be trained end to end 2 the direction of semisupervised parsing is useful and promising not only for resourcepoor languages but also for popular languages like english successful research in this direction could be potentially helpful for lots of future work cons and suggestions on experiments my main concerns are around experiments overall i think they are not strong enough to demonstrate that this paper has sufficient contribution to semisupervised parsing below are details 1 the current version only used 10 of the original training data as labeled and the rest as unlabeled data this makes the reported numbers way below existing stateoftheart performance for example the sota uas on english ptb has been 95 ideally the authors should be able to train a competitive supervised parser on full training data english or other languages and get a huge amount of unlabeled data from other sources eg news to further push up the performance the current setting makes it hard to justify how useful the proposed method could be in practice 2 the best numbers from the proposed model are lower than the baseline kiperwasser goldberg on english and only marginally better on swedish this probably means the supervised baseline is weak and its hard to tell if the gains from vae will remain if applied to a stronger supervised baseline 3 a performance curve with different amounts of labeled and unlabeled data would be useful to better understand the impact of semisupervised learning 4 whats the impact of perturbation one could simply use teisnerw as approximation did you observe any significant benefits from sampling other questions 1 whats the impact of keeping the tree constraint on dependencies during backpropagation have you tried removing the tree constraint like previous work 2 are sentence embedding and trees generated from two separate lstm encoders is there any parameter sharing between the two ### Summary:
this paper proposes a semisupervised learning method that uses a latent variable generative model for dependency parsing the key learning method consists of making perturbations to the logits going into a parsing algorithm to make it possible to sample within the variational autoencoder framework significant gains are found through semisupervised learning the largest reviewer concern was that the baselines were potentially not strong enough as significantly better numbers have been reported in previous work which may have the effect of overstating the perceived utility overall though it seems that the reviewers appreciated the novel solution to an important problem and in general would like to see the paper accepted
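To illustrate the perturb-and-parse idea the reviews above refer to (add noise to the arc scores, then run the usual tree decoder so that decoding behaves like drawing a sample), here is a minimal sketch. This is not the authors' code: the decode_fn argument, the score shapes and the temperature parameter are assumptions made only for illustration, and independent Gumbel noise on edges only approximates exact sampling from the distribution over trees.

```python
import numpy as np

def gumbel_noise(shape, eps=1e-9, rng=np.random):
    # Gumbel(0, 1) noise via the inverse-CDF transform
    u = rng.uniform(low=eps, high=1.0 - eps, size=shape)
    return -np.log(-np.log(u))

def sample_tree(arc_scores, decode_fn, temperature=1.0):
    """arc_scores: (n, n) head-modifier score matrix produced by the parser.
    decode_fn: any argmax tree decoder, e.g. Eisner's dynamic program for projective trees.
    Perturbing the scores before decoding makes the argmax behave (approximately) like a
    sample from the induced distribution over trees, which is what lets a VAE treat the
    parse tree as a stochastic latent variable."""
    perturbed = (arc_scores + gumbel_noise(arc_scores.shape)) / temperature
    return decode_fn(perturbed)  # e.g. an array giving the head index of every word
```

With the noise removed (or the temperature driven to zero) the decoder reverts to the single unperturbed argmax tree, which is one way to read the reviewer question about how much the perturbation itself contributes.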
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: 1 this paper introduces curriculum learning to semisupervised keypoint localization which is an automatic pseudolabeled data selection method the method uses reinforcement learning to learn a series of dynamic thresholds 2 besides this paper proposes the crosstraining strategy for pseudolabeling to alleviate confirmation bias 3 the experiments show that the proposed method can effectively improve the performance on different datasets and surpass other semisupervised methods strengths 1 the task is practical and the motivation is sound how to select the threshold for pseudo labels is an important and complicated task 2 overall the method of this paper is technically reasonable and novel the paper first applies curriculum learning to semisupervised keypoint localization and proposes using rl to search for the best curriculum 3 the experiments show that the proposed method is effective compared to other ssl methods the ablation study validates that the three parts of the proposed method are all important and can improve performance by about 25 weaknesses 1 in my opinion the time complexity of the proposed algorithm is too high the rl search process increases the training time by tm times 128 in this paper the high complexity will make the model less scalable such as in dataset size the author could explain the current training cost and whether the scalability is indeed a problem 2 the datasets in the experiments in the article are somewhat simple and small in scale if the mainstream datasets in human pose estimation such as coco keypoint or full mpii can be used in experiments the contributions would be more convincing 3 the experimental comparison on the effect of parameter search is not sufficient and the selected baseline method is weak the paper only compares random search and does not fully explain the details of this baseline does this baseline method also select the optimal strategy from candidates of the same size tm in addition considering that using rl to search is one of the main contributions of this paper the author should consider comparing other stronger and more comprehensive baselines for example manually designing many curriculums whose thresholds are gradually decreasing on the epoch level and selecting the curriculum with the best result minor comments 1 the division and use of data sets are not particularly clear are the reported numbers all evaluated on the testing set and dval in eq3 uses the validation set but the paper says that the mpii validation set is for evaluation so what is the dval in mpii 2 in the paper the symbol n represents both the keypoint network and the number of epochs which is somewhat confusing 3 in my opinion the proposed method is not particularly related to the keypoint localization task it would be better if this method can be applied to other tasks 4 in addition the author could add a discussion of the following paper about semisupervised human pose estimation rongchang xie chunyu wang wenjun zeng yizhou wang an empirical study of the collapsing problem in semisupervised 2d human pose estimation iccv 2021 updates thanks for the authors response the response and new results address my main concern i tend to accept this paper this paper proposes a novel and effective threshold selection method for semisupervised keypoint localization meanwhile i think there are some weaknesses in practicality i currently choose borderline accept the author may further address the weaknesses docsep the paper introduces a method for semisupervised keypoint localization based on pseudolabeling with autocurriculum learning the autocurriculum learning approach learns a series of dynamic thresholds for automatic selection of highquality pseudolabeled examples for model retraining the reinforcement learning rl framework more specifically the proximal policy optimization algorithm is used to search for the optimal curriculum the method is evaluated on four benchmarks in keypoint localization strengths 1 the authors provide both intuition and theoretical explanations of the proposed method supported by experimental results 2 although all components are widely used techniques in the field the application of rl to tackle pseudolabeled sample selection for keypoint localization is novel 3 the proposed method achieves improvements upon the stateoftheart on several benchmarks for human and animal body landmark localization and specifically in a low labeled data regime 5 of data is labeled 4 informative ablation studies and evaluation of the generalization ability on domain transfer weaknesses 1 the rl part of the approach has many moving parts and it would be beneficial to justify the choices of hyperparameters in the proximal policy optimization algorithm 2 innerloop network training is executed multiple times during policy learning how has the training complexity time to convergence increased compared to previous works questions 1 section 32 is eq1 computed per image or per keypoint given that each image has k keypoints if an image has some keypoints with the confidence above a threshold and some are below does the image get selected for the training round 2 how does the proposed pseudolabeling strategy deal with notvisible keypoints in the image eg a left eye and a left wing of a bird are not visible if the bird is depicted from the rightside viewpoint the paper is wellwritten and easy to follow the proposed method is described in detail the method is evaluated on four popular datasets for the keypoint localization task the proposed method demonstrates superior performance especially in cases with only 5 of labeled data out of no more than 10000 examples ablation studies justify the design choices the method while combining existing techniques is proven experimentally to be superior to the previous works and will add to the body of knowledge on keypoint localization docsep a semisupervised learning method placl is proposed this method employs a pseudolabeling pl approach it consists of iterating the following steps these iterations are called rounds 1 predicting pseudolabels for unlabeled data using the current model 2 training a series of models from scratch using the labeled data and selections of pseudolabeled data the pseudolabel selection is performed using a series of thresholds called a curriculum over the scores output by the model the authors propose an autocurriculum learning acl strategy to automatically update the curriculum using a reinforcement learning approach ppo2 the fact that the curriculum is updated at each round is called curriculum residual learning they also employ a crosstraining strategy to prevent the issue of confirmation bias the performance of placl is evaluated on a keypoint localization application on 5 different datasets when the percentage of labeled data is very low 5 or 10 placl outperforms the state of the art semisupervised learning competitors strengths 1 the idea of performing acl using ppo2 is simple and elegant 2 the paper is well written and easy to read 3 the method is always either on par with or outperforms the keypoint localization network competitors weaknesses questions 1 placl seems computationally very intensive in the experiments m8 r6 and t16 so it means that 736 networks are trained from scratch isnt it if i am not mistaken i did not see any information concerning the training time of placl type and number of gpus used and a comparison against sota methods for instance how does placl compare against sskl in terms of training time memory footprint etc 2 the results that are reported in the experiments table 1 and 2 for the line placl ours correspond to a forward pass in your implementation of hrnet using which weights in alg 2 the output is the optimal curriculum does it mean that a final training is performed using the last threshold of the optimal curriculum if that is the case i suggest adding this step at the end of alg2 and returning the weights of this network 3 placl seems application agnostic the results are always either on par with or outperform the keypoint localization network competitors but the improvement wrt sskl for instance is not huge i suggest considering a second application to strengthen the paper 4 in section 42 it is said however the analysis of their combined effects is outside the scope of this work if there exist methods with code available that combine these effects i suggest including them in the experiments even if they outperform placl 5 in alg2 pseudolabels are predicted using nomegar1 is it the network obtained at the previous round for jm and tt or do you train another network with gammar1 i think this question is related to my remark 2 the paper is interesting and well written but the novelty is limited using ppo2 to update the thresholds the results are not very impressive especially compared to sskl and the evaluation is limited to a single application keypoint localization while placl is application agnostic thus i believe the paper is not ready for a publication at iclr please answer my questions from the main review ### Summary:
this paper proposes a pseudolabeled data selection method for semisupervised pose estimation the investigated task in this paper is practical and useful the framework is well designed and reasonable and extensive ablation studies are conducted to test the efficacy of the method after discussion all the reviewers recommend accept of this paper
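Both reviews above describe the same core loop: predict pseudo-labels on unlabeled images, keep only those whose confidence clears a per-round threshold from the curriculum, and retrain two cross-trained networks from scratch on the union of labeled and selected data. The sketch below is a minimal illustration of that loop, not the paper's code: the keypoint network, its predict/fit interface, and the curriculum values are hypothetical stand-ins, and whether the threshold is applied per keypoint or per image (the min over keypoints assumed here) is precisely the reviewer's open question about eq. 1.

```python
# Minimal sketch of threshold-curriculum pseudo-labelling with cross-training.
# The model interface (predict/fit) and the curriculum itself are assumed
# stand-ins, not the authors' implementation.

def select_pseudo_labels(model, unlabeled_images, threshold):
    """Keep an image only if every predicted keypoint clears the confidence threshold."""
    selected = []
    for image in unlabeled_images:
        keypoints, confidences = model.predict(image)      # assumed interface
        if min(confidences) >= threshold:
            selected.append((image, keypoints))             # predictions become pseudo-labels
    return selected

def train_with_curriculum(make_model, labeled_data, unlabeled_images, curriculum):
    """curriculum: one confidence threshold per round, e.g. the sequence found by the RL search."""
    model_a, model_b = make_model(), make_model()           # two networks for cross-training
    model_a.fit(labeled_data)
    model_b.fit(labeled_data)
    for threshold in curriculum:                             # one retraining round per threshold
        pseudo_for_b = select_pseudo_labels(model_a, unlabeled_images, threshold)
        pseudo_for_a = select_pseudo_labels(model_b, unlabeled_images, threshold)
        model_a, model_b = make_model(), make_model()        # retrain from scratch each round
        model_a.fit(labeled_data + pseudo_for_a)             # each network consumes the other's
        model_b.fit(labeled_data + pseudo_for_b)             # pseudo-labels to limit confirmation bias
    return model_a, model_b
```

Training from scratch inside every round is what makes the reviewers' cost concern concrete: with several candidate thresholds per round, multiple rounds, and repeated policy-search iterations, the number of such inner trainings multiplies quickly.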
[ input_ids, attention_mask, labels: token-id encodings of the review and summary text above ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper presents two ar techniques for outofview target guiding awareness and variants in two experiments with human participants in the first experiment two variants of the sous technique bsous and fsous were compared using a target selection task both techniques are built on the existing fa technique and are less intrusive than the original technique the first experiment showed that they both perform better although by a small margin than fa furthermore they found that bsous was more robust in complex environments than fsous in the second experiment the authors compared four variants of the fa technique in a second experiment in which two additional behaviors were added to fa the experiment showed that each technique offered tradeoffs in speed and accuracy and there was some indication that the complexity of the environment was also a factor in how effective a particular techniques is the paper is clearly written and the techniques and rationale for selecting them are well described the quantitative data is well presented and future research can explore how these techniques perform in realistic applications this brings to my concerns about the paper i had a hard time determining if the paper is making a large enough contribution to the field to warrant publication at gi the research as presented has not resulted in large gains although some statistically significant results were observed and it remains to be seen whether each techniques particular gains would impact the user experience in concrete applications eg gaming etc having said that i appreciate that the authors discuss some of these aspects in the paper and especially in section 73 i suggest the authors add a limitations section and provide some ideas for how the research can be extended in the future another issue that needs clarification is the relationship between experiment 1 and 2 upon my first reading i got the impression that experiment 1 informed experiment 2 but it seems like both experiments were completed at the same time and by the same participants i recommend the authors revise the methods section to clarify this point further docsepthe paper describes two experiments that evaluate different techniques to guide users towards a target outside their field of view in the first experiment they evaluate two techniques fsous low visual salience and bsous visually salient to guide users towards a target outside their field of view they compare this technique against flyingarrow a previously proposed technique that uses an arrow inside the ve to guide the users results show their techniques to improve user performance over flyingarrow in various environment types in the second experiment they test different modifications of flyingarrow to improve their performance they found that adding trails helped performance but were more intrusive this is a good paper but some missing elements make judging the results difficult here are some questions i had while reading the paper for both experiments 1 why the inview faded and outofview sparked targets had different selection feedback also in figure 6 the outofview target is red vs yellow for the inview target is this difference part of the user study different selection feedback might have distracted users but it was not mentioned when discussing the results 2 what are the target and cursor sizes and why were those selected even if they are not part of the evaluated 
parameters the user performance especially if the selection is gazebased 3 what device was used to track the eyes during the study for gazeselection also include device latency in the paper 4 was the saccadic eyemovement considered for the data analysis 5 explain what a short break means in study 2 and include if study 1 participants also took a break for experiment 1 1 in section 562 the paper mentions that we did not have to consider fittss law for this study because our targets have the same angular size yet in the next paragraph the paper discusses that the increase in angular distances did not follow fitts law i suggest removing this sentence as the experiment did not follow the correct protocol to make this conclusion 2 some conclusions are too strong and not supported by the data i recommend adding what data support these conclusions or removing the sentence for example we argue that the true strength of fa is not about maximizing speed of target acquisition but to limit it for experiment 2 1 why was the modified flyingarrow white and not red like gruenefeld et al s paper colour has a large effect on visual cues so i suspect this might affect the results but it is not mentioned in the discussion minor comments 1 it is difficult to understand the differences between flyingarrow modifications from figure 3 docsepthe paper describes and analyzes comparisons of different techniques to indicate and locate outofview targets in vr environment the authors generally prefer the two sous methods over the prior work fa method but they didnt claim contribution to sous so the contribution of the paper is mostly the comparative study overall the paper is well written with details of the experiments recorded and analyzed i think the technical contribution is weak but the results may have good practical use note figure 3 may be wrong b should be the technique with trail c should be the one without trail ### Summary:
overall the reviewers had positive responses to the paper although they also had concerns about the technical contribution of the paper and have asked the authors to provide more details about different aspects of their study including the relationship between the two studies i will make an overall recommendation that the paper is considered a marginal accept and request that the authors pay close attention to the detailed feedback provided by the reviewers and incorporate them into the paper before final submission
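For the Fitts's-law point raised against the first experiment, the quantity at issue is the index of difficulty computed from angular distance and angular target width. The snippet below only illustrates the standard Shannon formulation the reviewer is invoking; the regression constants are invented for the example, not fitted from the study's data.

```python
# Illustrative only: Shannon formulation of Fitts's law on angular quantities.
import math

def index_of_difficulty(distance_deg, width_deg):
    return math.log2(distance_deg / width_deg + 1.0)

def predicted_movement_time(distance_deg, width_deg, a=0.2, b=0.15):
    # a, b are technique/device-specific constants normally fitted from data;
    # the values here are made up for illustration.
    return a + b * index_of_difficulty(distance_deg, width_deg)

# e.g. an out-of-view target 90 degrees away with a 5-degree angular size
print(predicted_movement_time(90.0, 5.0))   # 0.2 + 0.15 * log2(19) ≈ 0.84
```

This makes the objection concrete: with a single fixed angular width, varying distance alone does not follow the protocol under which the law is normally fitted, which is why the reviewer asks for that conclusion to be removed rather than reinterpreted.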
[ input_ids, attention_mask, labels: token-id encodings of the review and summary text above ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary this work presents a novel neural model xref for entity linking in chinese online news comments two new datasets of news articles and comments in entertainment and product domains are collected and annotated for evaluation on this task the unique problem setupup facilitates xref to use the corresponding news article in the following ways a construct a candidate entity set with high coverage since comments mostly discuss entities in the article b use a novel attention mechanism over the news article c guide these article attention values using a supervised loss furthermore xref leverages unannotated articles and comments using matchbased weak supervision the model achieves improvements over existing sota entity linking models and strong baselines for the proposed tasks especially for plural pronominal mentions pros authors identify a novel way to tackle the lack of context for entitylinking in social media posts use the corresponding news article connected to the posts the work presents an effective model to use a relatedlinked article when its available there is potential to combine xref with other sources of context like user history for broader applications cons comments without entities are left out in the constructed datasets this could make the task of mention detection harder as negative samples are missing the paper lacks ablations for weak supervison and the different attention mechanisms proposed comment articledocsepthe paper describes a method to perform entity linking across news and news comments in chinese using attention mechanisms to pinpoint relevant context within comments and detect supporting entities in the article body the authors use a weakly supervised training scheme to work with a large scale corpus the method is well described the model has promising results compared to the state of the art and the chineselanguage entity linking corpus is a welcome addition because of these reasons the paper is a good candidate for the conference the only suggestion i have for the cameraready version is a discussion about the generalizability of this methodology is this method dependent on the articlecomment structure would it work with other datasets eg a wikipedia page and editor discussions finally i have a question about the usage of attention would it make sense to use other comments in addition to the article body itself for the detection of supporting entities it seems like this could help in the case when conversations happen between commentersdocsepthis paper focuses on the problem of entity linking in chinese social media compared to entity linking in documents entity linking in social media poses additional problems as social media post have limited context and the language is informal the paper proposes xref which overcomes these problems by utilizing additional context from comments and associated articles and by using data augmentation the paper is overall well written xref uses an attention mechanisms to pinpoint relevant context within comments and detect supporting entities from the news article a weakly supervised training scheme is utilized to employ unlabelled corpus the authors also propose two new lowresource datasets experimental results demonstrate effectiveness of xref over other baselines the paper would have been stronger if results on at least one more language were reported discussioncomparison with the following relevant prior 
work will be useful 1 entity linking on chinese microblogs via deep neural network weixin zeng jiuyang tang xiang zhao 2 chinese social media entity linking based on effective context with topic semantics chengfang ma ying sha jianlong tan li guo huailiang peng ### Summary:
all reviewers are fairly positive about the paper that deals with entity linking in chinese online news comments the key strengths of the paper are using additional context from associated articles by using data augmentation novel attention mechanism over news articles guidance to article attention values the paper is well written and has good results over state of the art reviewers pointed out suggestions for further work like trying another language doing ablation study testing the generalizability etc while all of these are good ideas to make the work more comprehensive and thorough still the paper stands on its own merits and should be a good addition to the conference
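The attention mechanism the reviews credit xref with can be pictured as scoring article-derived candidate entities for a comment mention by attending over the article body. The sketch below is a self-contained toy version on random vectors, so the encoders, dimensions, and the additive fusion of mention and context are assumptions rather than the model's actual architecture; the returned article-attention weights correspond to the quantity the first review says the paper guides with a supervised loss.

```python
# Toy sketch of comment-to-article attention for candidate-entity scoring.
# Encoders, dimensions, and the fusion step are stand-ins, not XREF itself.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def score_candidates(mention_vec, article_vecs, candidate_vecs):
    """
    mention_vec:    (d,)   encoding of the mention inside the comment
    article_vecs:   (n, d) encodings of article tokens or sentences
    candidate_vecs: (k, d) encodings of candidate entities drawn from the article
    """
    attn = softmax(article_vecs @ mention_vec)         # which article parts support the mention
    context = attn @ article_vecs                      # article-aware context vector, shape (d,)
    scores = candidate_vecs @ (mention_vec + context)  # simple additive fusion (assumed)
    return softmax(scores), attn                       # entity distribution + article attention

rng = np.random.default_rng(0)
d, n, k = 8, 5, 3
probs, attn = score_candidates(rng.normal(size=d),
                               rng.normal(size=(n, d)),
                               rng.normal(size=(k, d)))
print(probs.round(3), attn.round(3))
```

In this framing, one reviewer's suggestion of also attending over other comments in the thread amounts to concatenating extra context vectors into article_vecs.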
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 8774, 436, 789, 10262, 247, 4460, 11454, 1566, 1269, 709, 323, 10726, 20057, 275, 448, 5187, 3909, 3668, 5701, 767, 747, 15302, 273, 3668, 7774, 285, 5701, 275, 14608, 285, 1885, 10625, 403, 5728, 285, 28267, 323, 7103, 327, 436, 4836, 253, 4451, 1895, 9978, 484, 29499, 1269, 709, 281, 897, 253, 3969, 3668, 3929, 275, 253, 1563, 4088, 50276, 66, 3989, 247, 7431, 10726, 873, 342, 1029, 7031, 1580, 5701, 6571, 2319, 14429, 275, 253, 3929, 50276, 67, 897, 247, 4460, 4116, 5122, 689, 253, 3668, 3929, 260, 7102, 841, 3929, 4116, 2193, 970, 247, 22296, 2957, 50276, 44295, 3062, 1269, 709, 19732, 1131, 440, 11423, 456, 7774, 285, 5701, 970, 3761, 3169, 5075, 20446, 253, 1566, 33526, 11701, 689, 5368, 256, 5503, 10726, 20057, 3210, 285, 2266, 1666, 25379, 323, 253, 4081, 8892, 3340, 323, 25540, 11093, 297, 989, 25957, 50275, 856, 84, 4477, 4271, 247, 4460, 1039, 281, 18915, 253, 3480, 273, 3634, 323, 10726, 30816, 275, 2675, 3420, 9319, 897, 253, 3969, 3668, 3929, 4802, 281, 253, 9319, 253, 789, 10262, 271, 3576, 1566, 281, 897, 247, 2905, 16862, 3929, 672, 697, 2130, 627, 310, 2442, 281, 13398, 1269, 709, 342, 643, 4973, 273, 3634, 751, 2608, 2892, 323, 16055, 4893, 50276, 5040, 50276, 26122, 1293, 14429, 403, 1669, 562, 275, 253, 8818, 15302, 436, 812, 1056, 253, 4836, 273, 3748, 5481, 12150, 347, 4016, 3530, 403, 5816, 50276, 783, 2929, 19756, 490, 77, 569, 323, 5075, 35220, 1988, 285, 253, 1027, 4116, 6297, 4081, 4385, 18575, 1070, 406, 339, 431, 248, 2929, 8631, 247, 1332, 281, 1347, 10726, 20057, 2439, 3668, 285, 3668, 5701, 275, 448, 5187, 970, 4116, 6297, 281, 45661, 4623, 3634, 1561, 5701, 285, 2736, 8109, 14429, 275, 253, 3929, 2133, 253, 4477, 897, 247, 22112, 22296, 3733, 6974, 281, 789, 342, 247, 1781, 4311, 20689, 50276, 783, 1332, 310, 973, 2529, 253, 1566, 556, 12532, 1543, 2429, 281, 253, 1375, 273, 253, 1445, 285, 253, 448, 1100, 293, 2848, 10726, 20057, 20689, 310, 247, 10112, 1635, 984, 273, 841, 4606, 253, 2929, 310, 247, 1175, 7431, 323, 253, 8059, 50276, 783, 760, 14876, 891, 452, 323, 253, 4049, 254, 609, 5102, 2715, 310, 247, 5955, 670, 253, 2087, 50228, 273, 436, 16182, 310, 436, 1332, 7976, 327, 253, 3929, 13982, 2605, 651, 352, 789, 342, 643, 15302, 24088, 247, 259, 15170, 3239, 285, 8121, 11985, 50276, 71, 3341, 891, 452, 247, 1953, 670, 253, 10393, 273, 4116, 651, 352, 1056, 3282, 281, 897, 643, 5701, 275, 1635, 281, 253, 3929, 2133, 3139, 323, 253, 5481, 273, 8109, 14429, 352, 3133, 751, 436, 812, 1361, 275, 253, 1083, 672, 16072, 5108, 875, 4385, 398, 7152, 33032, 2520, 2929, 16633, 327, 253, 1895, 273, 10726, 20057, 275, 448, 5187, 2675, 3420, 2429, 281, 10726, 20057, 275, 7177, 10726, 20057, 275, 2675, 3420, 24543, 3081, 3237, 347, 2675, 3420, 1501, 452, 3710, 3634, 285, 253, 3448, 310, 25040, 253, 2929, 29328, 1269, 709, 534, 689, 3217, 841, 3237, 407, 17617, 3081, 3634, 432, 5701, 285, 2330, 7774, 285, 407, 970, 941, 42072, 253, 2929, 310, 4583, 973, 3542, 50276, 89, 709, 4648, 271, 4116, 6297, 281, 45661, 4623, 3634, 1561, 5701, 285, 2736, 8109, 14429, 432, 253, 3668, 3929, 247, 22112, 22296, 3733, 6974, 310, 12845, 281, 2126, 440, 47728, 20689, 253, 4477, 671, 12661, 767, 747, 1698, 15024, 15302, 5661, 1543, 7568, 12510, 273, 1269, 709, 689, 643, 1666, 25379, 50276, 783, 2929, 651, 452, 644, 10046, 604, 1543, 327, 387, 1878, 581, 625, 3448, 497, 2361, 50276, 49794, 47109, 342, 253, 1563, 4623, 2720, 789, 588, 320, 
4217, 50276, 18, 10726, 20057, 327, 448, 5187, 2494, 46856, 3066, 3676, 11454, 2990, 359, 895, 249, 1182, 1205, 480, 74, 7352, 606, 12717, 1269, 22589, 1182, 31035, 374, 448, 5187, 2675, 3420, 10726, 20057, 1754, 327, 3576, 3634, 342, 9400, 35185, 260, 24176, 71, 606, 6429, 340, 272, 48183, 480, 757, 5056, 23136, 632, 1149, 80, 30287, 647, 22589, 42151, 2490, 187, 4118, 18435, 27, 455, 30628, 403, 9648, 2762, 670, 253, 2929, 326, 13330, 342, 10726, 20057, 275, 448, 5187, 3909, 3668, 5701, 253, 2234, 20544, 273, 253, 2929, 403, 970, 3081, 3634, 432, 2330, 7774, 407, 970, 941, 42072, 4460, 4116, 5122, 689, 3668, 7774, 12925, 281, 3929, 4116, 2193, 253, 2929, 310, 973, 3542, 285, 556, 1175, 1543, 689, 1375, 273, 253, 1445, 30628, 8042, 562, 13991, 323, 2007, 789, 751, 2820, 1529, 3448, 2509, 28913, 1263, 5175, 253, 2087, 50228, 3966, 1223, 512, 273, 841, 403, 1175, 5697, 281, 1056, 253, 789, 625, 11088, 285, 11080, 1335, 253, 2929, 9572, 327, 697, 1211, 16108, 285, 943, 320, 247, 1175, 1635, 281, 253, 8059, 50276 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 8774, 436, 789, 10262, 247, 4460, 11454, 1566, 1269, 709, 323, 10726, 20057, 275, 448, 5187, 3909, 3668, 5701, 767, 747, 15302, 273, 3668, 7774, 285, 5701, 275, 14608, 285, 1885, 10625, 403, 5728, 285, 28267, 323, 7103, 327, 436, 4836, 253, 4451, 1895, 9978, 484, 29499, 1269, 709, 281, 897, 253, 3969, 3668, 3929, 275, 253, 1563, 4088, 50276, 66, 3989, 247, 7431, 10726, 873, 342, 1029, 7031, 1580, 5701, 6571, 2319, 14429, 275, 253, 3929, 50276, 67, 897, 247, 4460, 4116, 5122, 689, 253, 3668, 3929, 260, 7102, 841, 3929, 4116, 2193, 970, 247, 22296, 2957, 50276, 44295, 3062, 1269, 709, 19732, 1131, 440, 11423, 456, 7774, 285, 5701, 970, 3761, 3169, 5075, 20446, 253, 1566, 33526, 11701, 689, 5368, 256, 5503, 10726, 20057, 3210, 285, 2266, 1666, 25379, 323, 253, 4081, 8892, 3340, 323, 25540, 11093, 297, 989, 25957, 50275, 856, 84, 4477, 4271, 247, 4460, 1039, 281, 18915, 253, 3480, 273, 3634, 323, 10726, 30816, 275, 2675, 3420, 9319, 897, 253, 3969, 3668, 3929, 4802, 281, 253, 9319, 253, 789, 10262, 271, 3576, 1566, 281, 897, 247, 2905, 16862, 3929, 672, 697, 2130, 627, 310, 2442, 281, 13398, 1269, 709, 342, 643, 4973, 273, 3634, 751, 2608, 2892, 323, 16055, 4893, 50276, 5040, 50276, 26122, 1293, 14429, 403, 1669, 562, 275, 253, 8818, 15302, 436, 812, 1056, 253, 4836, 273, 3748, 5481, 12150, 347, 4016, 3530, 403, 5816, 50276, 783, 2929, 19756, 490, 77, 569, 323, 5075, 35220, 1988, 285, 253, 1027, 4116, 6297, 4081, 4385, 18575, 1070, 406, 339, 431, 248, 2929, 8631, 247, 1332, 281, 1347, 10726, 20057, 2439, 3668, 285, 3668, 5701, 275, 448, 5187, 970, 4116, 6297, 281, 45661, 4623, 3634, 1561, 5701, 285, 2736, 8109, 14429, 275, 253, 3929, 2133, 253, 4477, 897, 247, 22112, 22296, 3733, 6974, 281, 789, 342, 247, 1781, 4311, 20689, 50276, 783, 1332, 310, 973, 2529, 253, 1566, 556, 12532, 1543, 2429, 281, 253, 1375, 273, 253, 1445, 285, 253, 448, 1100, 293, 2848, 10726, 20057, 20689, 310, 247, 10112, 1635, 984, 273, 841, 4606, 253, 2929, 310, 247, 1175, 7431, 323, 253, 8059, 50276, 783, 760, 14876, 891, 452, 323, 253, 4049, 254, 609, 5102, 2715, 310, 247, 5955, 670, 253, 2087, 50228, 273, 436, 16182, 310, 436, 1332, 7976, 327, 253, 3929, 13982, 2605, 651, 352, 789, 342, 643, 15302, 24088, 247, 259, 15170, 3239, 285, 8121, 11985, 50276, 71, 3341, 891, 452, 247, 1953, 670, 253, 10393, 273, 4116, 651, 352, 1056, 3282, 281, 897, 643, 5701, 275, 1635, 281, 253, 3929, 2133, 3139, 323, 253, 5481, 273, 8109, 14429, 352, 3133, 751, 436, 812, 1361, 275, 253, 1083, 672, 16072, 5108, 875, 4385, 398, 7152, 33032, 2520, 2929, 16633, 327, 253, 1895, 273, 10726, 20057, 275, 448, 5187, 2675, 3420, 2429, 281, 10726, 20057, 275, 7177, 10726, 20057, 275, 2675, 3420, 24543, 3081, 3237, 347, 2675, 3420, 1501, 452, 3710, 3634, 285, 253, 3448, 310, 25040, 253, 2929, 29328, 1269, 709, 534, 689, 3217, 841, 3237, 407, 17617, 3081, 3634, 432, 5701, 285, 2330, 7774, 285, 407, 970, 941, 42072, 253, 2929, 310, 4583, 973, 3542, 50276, 89, 709, 4648, 271, 4116, 6297, 281, 45661, 4623, 3634, 1561, 5701, 285, 2736, 8109, 14429, 432, 253, 3668, 3929, 247, 22112, 22296, 3733, 6974, 310, 12845, 281, 2126, 440, 47728, 20689, 253, 4477, 671, 12661, 767, 747, 1698, 15024, 15302, 5661, 1543, 7568, 12510, 273, 1269, 709, 689, 643, 1666, 25379, 50276, 783, 2929, 651, 452, 644, 10046, 604, 1543, 327, 387, 1878, 581, 625, 3448, 497, 2361, 50276, 49794, 47109, 342, 253, 1563, 4623, 2720, 789, 588, 320, 
4217, 50276, 18, 10726, 20057, 327, 448, 5187, 2494, 46856, 3066, 3676, 11454, 2990, 359, 895, 249, 1182, 1205, 480, 74, 7352, 606, 12717, 1269, 22589, 1182, 31035, 374, 448, 5187, 2675, 3420, 10726, 20057, 1754, 327, 3576, 3634, 342, 9400, 35185, 260, 24176, 71, 606, 6429, 340, 272, 48183, 480, 757, 5056, 23136, 632, 1149, 80, 30287, 647, 22589, 42151, 2490, 187, 4118, 18435, 27, 455, 30628, 403, 9648, 2762, 670, 253, 2929, 326, 13330, 342, 10726, 20057, 275, 448, 5187, 3909, 3668, 5701, 253, 2234, 20544, 273, 253, 2929, 403, 970, 3081, 3634, 432, 2330, 7774, 407, 970, 941, 42072, 4460, 4116, 5122, 689, 3668, 7774, 12925, 281, 3929, 4116, 2193, 253, 2929, 310, 973, 3542, 285, 556, 1175, 1543, 689, 1375, 273, 253, 1445, 30628, 8042, 562, 13991, 323, 2007, 789, 751, 2820, 1529, 3448, 2509, 28913, 1263, 5175, 253, 2087, 50228, 3966, 1223, 512, 273, 841, 403, 1175, 5697, 281, 1056, 253, 789, 625, 11088, 285, 11080, 1335, 253, 2929, 9572, 327, 697, 1211, 16108, 285, 943, 320, 247, 1175, 1635, 281, 253, 8059, 50276 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors address the problem of representation learning in which datagenerative factors of variation are separated or disentangled from each other pointing out that unsupervised disentangling is hard despite recent breakthroughs and that supervised disentangling needs a large number of carefully labeled data they propose a weakly supervised approach that does not require explicit factor labels but instead divides the training data in to two subsets one set the reference set is known to the learning algorithm to leave a set of generative target factors fixed at one specific value per factor while the other set is known to the learning algorithm to vary across all generative factors the problem setup posed by the authors is to separate the corresponding two sets of factors into two nonoverlapping sets of latents pros to address this problem the authors propose an architecture that includes a reverse klterm in the loss and they show convincingly that this approach is indeed successful in separating the two sets of generative factors from each other this is demonstrated in two different ways first quantitatively on an a modified mnist dataset showing that the information about the target factors is indeed mostly in the set of latents that are meant to capture them second qualitatively on the modified mnist and on a further dataset affectnet which has been carefully curated by the authors to improve the quality of the reference set the qualitative results are impressive and show that this approach can be used to transfer the target factors from one image onto another image technically this work combines and extends a set of interesting techniques into a novel framework applied to a new way of disentangling two sets of factors of variation with a vae approach cons the problem that this work solves seems somewhat artificial and the training data while less burdensome than having explicit labels is still difficult to obtain in practice more importantly though both the title and the start of the both the abstract and the introduction are somewhat misleading thats because this work does not actually address disentangling in the sense of learning disentangled representations from visual data where highlevel generative factors correspond to independent dimensions of feature vectors what it really addresses is separating two sets of factors into different parts of the representation within each of which the factors can be are very likely are entangled with each other related to the point that this work is not really about disentangling the quantitative comparisons with completely unsupervised baselines are not really that meaningful at least not in terms of what this work sets out to do all it shows is whether information about the target factors is easily linearly decodable from the latents which while related to disentangling says little about the quality of it on the positive side this kind of quantitative comparison where the authors approach has to show that the information exists in the correct part of the space is not pitted unfairly against the unsupervised baselines update the authors have made a good effort to address the concerns raised and i believe the paper should be accepted in its current form i have increased my rating from 6 to 7 accordingly docsepthe paper proposes reference based vaes which considers learning semantically meaningful feature with weak supervision the 
latent variable contains two parts one related to the reference set and the other irrelevant to prevent degenerate solutions the paper proposed to use reverse kl resulting in an alice-style objective the paper demonstrates interesting empirical results on feature prediction conditional image generation and image synthesis i dont really see how equation 5 in symmetric kl prevents learning redundant z ie z contains all information of e it seems one could have both kl terms near zero but also have p(x|z, e) = p(x|z) one scenario would be the case where z contains all the information about e which learns the reference latent features so we have redundant information in z in this case the learned features e are informative but the decoder does not use e anyway to ensure that z does not contain information about e one could add an adversarial predictor that tries to predict e from z note that this cannot be detected by the feature learning metric because it ignores z for rbvae during training the experiments on conditional image generation look interesting but i wonder if the ground truth transformation for mnist can be simply described as some linear transformation on the original image i wonder if the proposed method works on svhn where you can use label information as reference supervision moreover i wonder if it is possible to use multiple types of reference images but fewer images in each type to reach comparable or even better performance minor points why assume that the reference distribution is a delta distribution whose support has measure zero instead of a regular gaussian equations 6 8 10 seem overly complicated due to the semisupervised nature of the objective i wonder if having an additional figure would make things clearer maybe it is helpful to cite the alice paper li et al for equation 10 table 1 maybe add the word respectively so it is clearer which metric you use for which dataset i wonder if it is fair enough to compare feature prediction with vae and other models since they do not use any weak supervision a fairer baseline could consider learning with the weak supervision labels containing the information that some images have the same label the improvement on affectnet compared to regular vae does not look amazing given the additional weak supervision docsepsummary given two sets of data where one is unlabelled and the other is a reference data set with a particular factor of variation that is fixed the approach disentangles this factor of variation from the others the approach uses a vae whose latents are split into e that represents the factor of variation and z that represents the remaining factors a symmetric kl loss that is approximated using the density-ratio trick is optimised for the learning and the method is applied to mnist digit style disentangling and affectnet facial expression disentangling pros clearly written results look promising both quantitative and qualitative cons mathieu et al disentangle a specific factor from others without explicit labels but by drawing two images with the same value of the specified factor ie drawing from the reference set and also drawing a third image with any value of the specified factor ie drawing from the unlabelled set hence their approach is directly applicable to the problem at hand in the paper although mathieu et al use digit/face identity as the shared factor their method is directly applicable to the case where the shared factor is digit style/facial expression hence it appears to me that it should be compared against missing reference:
bouchacourt: explicit labels are not given and data is grouped where each group shares a factor of variation but here the data is assumed to be partitioned into groups so there is no equivalent to the unlabelled set hence difficult to compare against for the outlined tasks regarding comparison against unsupervised disentangling methods there have been more recent approaches since betavae and dipvae eg factorvae kim et al tcvae chen et al it would be nice to compare against these methods not only via predictive accuracy of target factors but also using disentangling metrics specified in these papers other questions/comments the kl terms in 5 are intractable due to the densities pu(x) and pr(x) hence two separate discriminators need to be used to approximate two separate density ratios making the model rather large and complicated with many moving parts what would happen if these kl terms in 5 are dropped and one simply uses sgvb to optimise the resulting loss without the need for discriminators usually discriminators tend to heavily underestimate density ratios see eg rosca et al especially for densities defined on high dimensions so it might be best to avoid them whenever possible the requirement of adding reconstruction terms to the loss in 10 is perhaps evidence of this because these reconstruction terms are already present in the loss 3 5 that the discriminator should be approximating so the necessity of extra regularisation of these reconstruction terms suggests that the discriminator is giving poor estimates of them the reconstruction terms for ze in 5 appear sufficient to force the model to use e which is the motivation given in the paper for using the symmetric kl akin to how infogan forces the model to use the latents so the necessity of the kl terms in 5 is questionable and appears to need further justification and/or ablation studies minor why not learn the likelihood variance lambda revision i am convinced by the rebuttal of the authors hence have modified my score accordingly ### Summary:
this is a proposed method that studies learning of disentangled representations in a relatively specific setting defined as follows: given two datasets, one unlabeled and another that has a particular factor of variation fixed, the method will disentangle the factor of variation from the others the reviewers found the method promising with interesting results, both qualitative and quantitative the weaknesses of the method as discussed in the reviews and after: the quantitative results with weak supervision are not a big improvement over betavae-like methods or mathieu et al a red flag of sorts to me is that it is not very clear where the gains are coming from the authors claim to have done a fair comparison with the various baselines but they introduce an entirely new encoderdecoder architecture that was likely involuntarily, but still, tuned more to their method than others the setup as presented is somewhat artificial and less general than it could be, however this was not a major factor in my decision it is easy to get confused by the kind of disentangled representations that this work is aiming to get i think this has the potential to be a solid paper but at this stage it is missing a number of ablation studies to truly understand what sets it apart from the previous work at the very least there is a number of architectural and training choices in appendix d, like the 0.25 dropout, that require more explanation, empirical understanding, and how they generalize to other datasets given all of this, at this point it is hard for me to recommend acceptance of this work i encourage the authors to take all this feedback into account, extend their work to more domains (the artistic-style disentangling that they mention seems like a good idea), and provide more empirical evidence about their architectural choices and their effect on the results
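one concrete way to act on the adversarial predictor suggestion raised above (checking whether z still carries information about e) is a simple post-hoc probe; the following is a hedged sketch of that diagnostic, not anything from the paper, and it assumes the learned codes are already available as numpy arrays

```python
# minimal leakage check, assuming z (nuisance codes) and e (reference-related
# codes) have been extracted from a trained encoder; a probe that predicts e
# from z well would indicate the redundant-z degenerate solution discussed above.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

def leakage_score(z, e):
    """z: (n, dz) array, e: (n, de) array; returns r^2 of predicting e from z."""
    z_tr, z_te, e_tr, e_te = train_test_split(z, e, test_size=0.3, random_state=0)
    probe = Ridge(alpha=1.0).fit(z_tr, e_tr)
    return probe.score(z_te, e_te)

# stand-in usage with random codes (independent, so the score should be near 0)
rng = np.random.default_rng(0)
z = rng.normal(size=(1000, 16))
e = rng.normal(size=(1000, 4))
print(leakage_score(z, e))
```

a score near zero suggests little leakage, a score near one would be the red flag; the adversarial variant suggested in the review would instead train such a predictor jointly with the encoder and penalize its success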
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 4477, 2953, 253, 1895, 273, 6779, 4715, 275, 534, 2856, 356, 4330, 800, 2616, 273, 7629, 403, 9070, 390, 557, 290, 33195, 432, 1016, 643, 13458, 562, 326, 440, 35421, 557, 290, 36874, 310, 1892, 5747, 3332, 29709, 84, 285, 326, 22296, 557, 290, 36874, 3198, 247, 1781, 1180, 273, 9257, 13130, 941, 597, 12661, 247, 22112, 22296, 2746, 326, 1057, 417, 2430, 6843, 2803, 13301, 533, 3185, 37141, 253, 3733, 941, 275, 281, 767, 20077, 581, 873, 253, 3806, 873, 310, 1929, 281, 253, 4715, 5933, 281, 3553, 247, 873, 273, 1006, 800, 2303, 2616, 4229, 387, 581, 2173, 1318, 591, 2803, 1223, 253, 643, 873, 310, 1929, 281, 253, 4715, 5933, 281, 6889, 2439, 512, 1006, 800, 2616, 253, 1895, 9978, 22691, 407, 253, 4477, 310, 281, 4858, 253, 3969, 767, 5239, 273, 2616, 715, 767, 1327, 1189, 77, 5436, 5239, 273, 4329, 592, 50275, 856, 84, 50276, 936, 2953, 436, 1895, 253, 4477, 12661, 271, 10336, 326, 3797, 247, 8107, 27451, 3945, 275, 253, 2957, 285, 597, 921, 2410, 1763, 5356, 326, 436, 2746, 310, 6296, 5547, 275, 23694, 253, 767, 5239, 273, 1006, 800, 2616, 432, 1016, 643, 436, 310, 5183, 275, 767, 1027, 4088, 806, 36878, 327, 271, 247, 7321, 278, 79, 382, 10895, 4645, 326, 253, 1491, 670, 253, 2303, 2616, 310, 6296, 6571, 275, 253, 873, 273, 4329, 592, 326, 403, 5486, 281, 9232, 731, 1273, 36143, 327, 253, 7321, 278, 79, 382, 285, 327, 247, 2007, 10895, 2818, 3024, 534, 556, 644, 9257, 1095, 456, 407, 253, 4477, 281, 3157, 253, 3290, 273, 253, 3806, 873, 253, 18276, 1543, 403, 13943, 285, 921, 326, 436, 2746, 476, 320, 908, 281, 3700, 253, 2303, 2616, 432, 581, 2460, 4830, 1529, 2460, 50276, 23693, 1037, 436, 789, 24772, 285, 8725, 247, 873, 273, 4722, 5609, 715, 247, 4460, 7792, 3732, 281, 247, 747, 1039, 273, 557, 290, 36874, 767, 5239, 273, 2616, 273, 7629, 342, 247, 362, 3348, 2746, 50275, 5040, 50276, 783, 1895, 326, 436, 789, 35910, 3133, 8489, 13345, 285, 253, 3733, 941, 1223, 1679, 32274, 485, 685, 1907, 6843, 13301, 310, 1335, 2834, 281, 4044, 275, 3946, 625, 15538, 2167, 1097, 253, 4060, 285, 253, 1265, 273, 253, 1097, 253, 12002, 285, 253, 10199, 403, 8489, 24363, 28763, 984, 436, 789, 1057, 417, 2686, 2953, 557, 290, 36874, 275, 253, 3282, 273, 4715, 557, 290, 33195, 14237, 432, 5304, 941, 835, 1029, 5251, 1006, 800, 2616, 2723, 281, 3907, 10103, 273, 4735, 11390, 752, 352, 1663, 12453, 310, 23694, 767, 5239, 273, 2616, 715, 1027, 4243, 273, 253, 6779, 1561, 1016, 273, 534, 253, 2616, 476, 320, 403, 1077, 2779, 403, 36255, 342, 1016, 643, 50276, 4919, 281, 253, 1127, 326, 436, 789, 310, 417, 1663, 670, 557, 290, 36874, 253, 11745, 14023, 342, 4336, 440, 35421, 1666, 25379, 403, 417, 1663, 326, 14282, 387, 1878, 417, 275, 2426, 273, 752, 436, 789, 5239, 562, 281, 513, 512, 352, 2722, 310, 1880, 1491, 670, 253, 2303, 2616, 310, 4354, 23352, 1086, 351, 494, 432, 253, 4329, 592, 534, 1223, 2905, 281, 557, 290, 36874, 2296, 1652, 670, 253, 3290, 273, 352, 327, 253, 2762, 1930, 436, 2238, 273, 11745, 5301, 835, 253, 4477, 2746, 556, 281, 921, 326, 253, 1491, 4961, 275, 253, 3451, 629, 273, 253, 2317, 310, 417, 268, 2166, 16593, 314, 1411, 253, 440, 35421, 1666, 25379, 50275, 11183, 50276, 783, 4477, 452, 1160, 247, 1175, 3434, 281, 2953, 253, 7350, 5439, 285, 891, 2868, 253, 2929, 943, 320, 7607, 275, 697, 1655, 830, 891, 452, 2559, 619, 13716, 432, 721, 281, 818, 15672, 5474, 339, 431, 248, 2929, 29328, 3806, 1754, 13460, 265, 534, 19401, 4715, 3300, 
39904, 14282, 4735, 342, 5075, 20446, 253, 21624, 4778, 4428, 767, 4243, 581, 2905, 281, 253, 3806, 873, 285, 253, 643, 19124, 281, 3657, 29458, 5482, 253, 2929, 4081, 281, 897, 8107, 27451, 4795, 275, 247, 355, 280, 12463, 8103, 253, 2929, 14371, 4722, 16774, 1543, 327, 4735, 10554, 17697, 2460, 5978, 285, 2460, 9066, 50276, 74, 13414, 1663, 923, 849, 5150, 608, 275, 13123, 27451, 16897, 4715, 28116, 1182, 26332, 1182, 4428, 512, 1491, 273, 299, 352, 3133, 581, 812, 452, 1097, 27451, 2426, 2822, 5058, 533, 671, 452, 268, 39344, 299, 50276, 3498, 91, 581, 10076, 651, 320, 253, 1083, 835, 1182, 4428, 512, 253, 1491, 670, 299, 534, 33772, 253, 3806, 21624, 3386, 594, 359, 452, 28116, 1491, 275, 1182, 275, 436, 1083, 253, 6311, 3386, 299, 403, 27096, 533, 253, 29810, 1057, 417, 897, 299, 667, 1576, 281, 5416, 326, 1182, 1057, 417, 3831, 1491, 670, 299, 581, 812, 823, 271, 48960, 23403, 326, 14177, 281, 3283, 299, 432, 1182, 3877, 326, 436, 2550, 320, 5189, 407, 253, 4735, 4715, 7982, 984, 352, 35136, 1182, 323, 45630, 21574, 1309, 3733, 50276, 783, 4679, 327, 17697, 2460, 5978, 1007, 4722, 533, 891, 4282, 604, 253, 3216, 5083, 9261, 323, 278, 79, 382, 476, 320, 3365, 2529, 347, 275, 690, 4872, 9261, 327, 253, 3236, 2460, 891, 4282, 604, 253, 4081, 1332, 2987, 327, 18504, 13107, 835, 368, 476, 897, 5203, 1491, 347, 3806, 20446, 25761, 891, 4282, 604, 352, 310, 1896, 281, 897, 2709, 3510, 273, 3806, 3888, 533, 11184, 3888, 275, 1016, 1511, 281, 3986, 10870, 390, 1014, 1805, 3045, 50276, 37585, 2792, 50276, 22309, 5467, 326, 253, 3806, 3268, 310, 18687, 3268, 3692, 1329, 556, 2557, 5058, 3185, 273, 247, 3963, 305, 12064, 50276, 23, 854, 884, 3133, 689, 9542, 1955, 281, 253, 49863, 29974, 13337, 3753, 273, 253, 8103, 891, 4282, 604, 1907, 271, 3081, 4677, 651, 1056, 1841, 30909, 50275, 28489, 352, 310, 9371, 281, 26542, 253, 355, 547, 2929, 632, 1162, 355, 323, 5150, 884, 50276, 2420, 337, 5046, 823, 253, 3159, 2975, 594, 352, 310, 30909, 534, 7982, 368, 897, 323, 534, 10895, 50276, 74, 4282, 604, 352, 310, 4344, 2217, 281, 7277, 4735, 10554, 342, 362, 3348, 285, 643, 3210, 1580, 597, 513, 417, 897, 667, 5075, 20446, 247, 22870, 83, 8245, 812, 1908, 4715, 342, 253, 5075, 20446, 13301, 4508, 253, 1491, 326, 690, 3888, 452, 253, 1072, 5203, 253, 7756, 327, 2818, 3024, 2429, 281, 3963, 362, 3348, 1057, 417, 1007, 8644, 1677, 253, 3081, 5075, 20446, 5474, 339, 793, 360, 3454, 1677, 767, 5239, 273, 941, 835, 581, 310, 440, 47728, 285, 253, 643, 310, 247, 3806, 941, 873, 342, 247, 1798, 2803, 273, 7629, 326, 310, 4229, 253, 2746, 557, 290, 19236, 436, 2803, 273, 7629, 432, 253, 2571, 253, 2746, 4648, 247, 362, 3348, 3692, 4329, 592, 403, 8085, 715, 299, 326, 6125, 253, 2803, 273, 7629, 285, 1182, 326, 6125, 253, 5780, 2616, 247, 13123, 27451, 2957, 326, 310, 34930, 970, 253, 4038, 29603, 10480, 310, 5556, 1701, 323, 253, 4715, 285, 253, 1332, 310, 3732, 281, 278, 79, 382, 6670, 3740, 557, 290, 36874, 285, 2818, 3024, 17754, 2048, 557, 290, 36874, 50276, 856, 84, 50276, 49346, 3542, 50276, 16680, 1007, 12532, 1097, 11745, 285, 18276, 50276, 5040, 50276, 679, 19683, 1162, 355, 557, 290, 2134, 247, 2173, 2803, 432, 2571, 1293, 6843, 13301, 533, 407, 10263, 767, 3888, 342, 253, 1072, 1318, 273, 253, 7616, 2803, 26332, 10263, 432, 253, 3806, 873, 285, 671, 10263, 247, 2626, 2460, 342, 247, 667, 1318, 273, 253, 7616, 2803, 26332, 10263, 432, 253, 440, 47728, 873, 7613, 616, 2746, 310, 3587, 7763, 281, 253, 1895, 387, 1133, 275, 253, 2929, 3738, 14168, 19683, 1162, 355, 897, 6670, 1664, 6489, 347, 253, 6096, 
2803, 616, 1332, 310, 3587, 7763, 281, 253, 1083, 835, 253, 6096, 2803, 310, 6670, 3740, 36101, 2048, 7613, 352, 4620, 281, 479, 326, 352, 943, 320, 2429, 1411, 50276, 33722, 3806, 50276, 67, 9764, 317, 950, 50276, 911, 20692, 13301, 403, 2649, 1677, 285, 941, 310, 24104, 835, 1016, 1387, 10764, 247, 2803, 273, 945, 533, 1060, 253, 941, 310, 8025, 281, 320, 10883, 264, 715, 2390, 594, 627, 310, 642, 6425, 281, 253, 440, 13068, 620, 264, 873, 7613, 2834, 281, 7277, 1411, 323, 253, 18627, 8892, 50276, 1747, 13218, 5301, 1411, 440, 35421, 557, 290, 36874, 3082, 627, 452, 644, 625, 3332, 7274, 1580, 701, 2623, 70, 285, 12539, 21574, 24088, 2803, 21574, 465, 303, 1162, 355, 246, 17312, 3348, 260, 864, 1162, 355, 352, 651, 320, 5322, 281, 7277, 1411, 841, 3082, 417, 760, 3066, 15970, 7200, 273, 2303, 2616, 533, 671, 970, 557, 290, 36874, 17082, 7616, 275, 841, 9380, 50276, 977, 2805, 84, 26122, 50276, 783, 27451, 2426, 275, 608, 403, 540, 44374, 1955, 281, 253, 16689, 268, 2310, 285, 819, 89, 7613, 767, 4858, 20741, 2392, 878, 281, 320, 908, 281, 16851, 767, 4858, 4038, 11878, 2403, 253, 1566, 2581, 1781, 285, 9542, 342, 1142, 4886, 4243, 752, 651, 5108, 604, 841, 27451, 2426, 275, 608, 403, 8231, 285, 581, 3365, 4648, 48237, 30890, 281, 5556, 885, 253, 4795, 2957, 1293, 253, 878, 323, 20741, 2392, 3798, 20741, 2392, 5257, 281, 11306, 45166, 4038, 11878, 923, 24088, 687, 1026, 66, 1162, 355, 3340, 16689, 2931, 327, 1029, 10103, 594, 352, 1537, 320, 1682, 281, 3693, 731, 10793, 1896, 253, 8284, 273, 6240, 14433, 2426, 281, 253, 2957, 275, 884, 310, 4931, 1941, 273, 436, 984, 841, 14433, 2426, 403, 2168, 1246, 275, 253, 2957, 495, 50276, 22, 326, 253, 7134, 12915, 943, 320, 4020, 839, 594, 253, 15504, 273, 4465, 3963, 5837, 273, 841, 14433, 2426, 5936, 326, 253, 7134, 12915, 310, 4933, 4105, 8197, 273, 731, 253, 14433, 2426, 323, 14756, 275, 608, 3176, 4209, 281, 3490, 253, 1566, 281, 897, 299, 534, 310, 253, 16038, 1677, 275, 253, 2929, 323, 970, 253, 13123, 27451, 33917, 281, 849, 2192, 19356, 5621, 253, 1566, 281, 897, 253, 4329, 592, 594, 253, 15504, 273, 253, 27451, 2426, 275, 608, 310, 30455, 285, 4620, 281, 878, 2007, 22861, 285, 263, 28913, 2175, 50276, 37585, 2139, 417, 3037, 253, 12177, 11041, 29331, 50275, 250, 4694, 50276, 74, 717, 13762, 407, 253, 30080, 22559, 273, 253, 4477, 7613, 452, 7321, 619, 4868, 15672, 187, 187, 4118, 18435, 27, 2520, 310, 247, 4081, 1332, 326, 2175, 4715, 273, 557, 290, 33195, 14237, 275, 247, 4942, 2173, 4758, 2931, 347, 3637, 1677, 767, 15302, 581, 440, 22027, 285, 1529, 326, 556, 247, 1798, 2803, 273, 7629, 4229, 253, 1332, 588, 557, 290, 2134, 253, 2803, 273, 7629, 432, 253, 2571, 253, 30628, 1119, 253, 1332, 12532, 342, 4722, 1543, 4426, 50276, 17149, 50276, 783, 32213, 273, 253, 1332, 347, 5469, 275, 253, 10123, 285, 846, 50275, 783, 11745, 1543, 342, 5075, 20446, 403, 417, 247, 1943, 7756, 689, 701, 2623, 44549, 3082, 390, 14168, 19683, 1162, 355, 50276, 66, 2502, 7908, 273, 16308, 281, 479, 310, 326, 352, 310, 417, 1077, 2590, 835, 253, 15988, 403, 3551, 432, 253, 4477, 1750, 281, 452, 2218, 247, 4344, 5301, 342, 253, 2710, 1666, 25379, 533, 597, 9569, 271, 7094, 747, 32049, 48759, 10336, 326, 369, 2779, 1901, 2084, 3441, 533, 1335, 24251, 625, 281, 616, 1332, 685, 2571, 50276, 783, 9978, 347, 3559, 310, 8489, 13345, 285, 1679, 2087, 685, 352, 812, 320, 2299, 436, 369, 417, 247, 2201, 2803, 275, 619, 3061, 352, 310, 3477, 281, 755, 13477, 407, 253, 2238, 273, 557, 290, 356, 1070, 14237, 326, 436, 789, 310, 26400, 281, 755, 50276, 74, 1158, 436, 556, 
253, 2442, 281, 320, 247, 4891, 2929, 533, 387, 436, 3924, 697, 5816, 247, 1180, 273, 28913, 2175, 281, 7777, 2096, 752, 5239, 352, 7419, 432, 253, 2045, 789, 387, 253, 1077, 1878, 627, 310, 247, 1180, 273, 27934, 285, 3733, 10165, 275, 30762, 277, 50276, 3022, 253, 470, 1099, 5926, 483, 50276, 3529, 2430, 625, 8813, 50276, 358, 5378, 474, 4685, 285, 849, 597, 39970, 281, 643, 15302, 50276, 28821, 512, 273, 436, 387, 436, 1127, 352, 310, 1892, 323, 479, 281, 5583, 14924, 273, 436, 789, 891, 11907, 253, 4477, 281, 1379, 512, 436, 8680, 715, 2395, 9017, 616, 789, 281, 625, 10625, 253, 21518, 4826, 557, 290, 36874, 326, 597, 3748, 3133, 751, 247, 1175, 2934, 285, 2085, 625, 16774, 1941, 670, 616, 27934, 10165, 285, 616, 1055, 327, 253, 1543 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 4477, 2953, 253, 1895, 273, 6779, 4715, 275, 534, 2856, 356, 4330, 800, 2616, 273, 7629, 403, 9070, 390, 557, 290, 33195, 432, 1016, 643, 13458, 562, 326, 440, 35421, 557, 290, 36874, 310, 1892, 5747, 3332, 29709, 84, 285, 326, 22296, 557, 290, 36874, 3198, 247, 1781, 1180, 273, 9257, 13130, 941, 597, 12661, 247, 22112, 22296, 2746, 326, 1057, 417, 2430, 6843, 2803, 13301, 533, 3185, 37141, 253, 3733, 941, 275, 281, 767, 20077, 581, 873, 253, 3806, 873, 310, 1929, 281, 253, 4715, 5933, 281, 3553, 247, 873, 273, 1006, 800, 2303, 2616, 4229, 387, 581, 2173, 1318, 591, 2803, 1223, 253, 643, 873, 310, 1929, 281, 253, 4715, 5933, 281, 6889, 2439, 512, 1006, 800, 2616, 253, 1895, 9978, 22691, 407, 253, 4477, 310, 281, 4858, 253, 3969, 767, 5239, 273, 2616, 715, 767, 1327, 1189, 77, 5436, 5239, 273, 4329, 592, 50275, 856, 84, 50276, 936, 2953, 436, 1895, 253, 4477, 12661, 271, 10336, 326, 3797, 247, 8107, 27451, 3945, 275, 253, 2957, 285, 597, 921, 2410, 1763, 5356, 326, 436, 2746, 310, 6296, 5547, 275, 23694, 253, 767, 5239, 273, 1006, 800, 2616, 432, 1016, 643, 436, 310, 5183, 275, 767, 1027, 4088, 806, 36878, 327, 271, 247, 7321, 278, 79, 382, 10895, 4645, 326, 253, 1491, 670, 253, 2303, 2616, 310, 6296, 6571, 275, 253, 873, 273, 4329, 592, 326, 403, 5486, 281, 9232, 731, 1273, 36143, 327, 253, 7321, 278, 79, 382, 285, 327, 247, 2007, 10895, 2818, 3024, 534, 556, 644, 9257, 1095, 456, 407, 253, 4477, 281, 3157, 253, 3290, 273, 253, 3806, 873, 253, 18276, 1543, 403, 13943, 285, 921, 326, 436, 2746, 476, 320, 908, 281, 3700, 253, 2303, 2616, 432, 581, 2460, 4830, 1529, 2460, 50276, 23693, 1037, 436, 789, 24772, 285, 8725, 247, 873, 273, 4722, 5609, 715, 247, 4460, 7792, 3732, 281, 247, 747, 1039, 273, 557, 290, 36874, 767, 5239, 273, 2616, 273, 7629, 342, 247, 362, 3348, 2746, 50275, 5040, 50276, 783, 1895, 326, 436, 789, 35910, 3133, 8489, 13345, 285, 253, 3733, 941, 1223, 1679, 32274, 485, 685, 1907, 6843, 13301, 310, 1335, 2834, 281, 4044, 275, 3946, 625, 15538, 2167, 1097, 253, 4060, 285, 253, 1265, 273, 253, 1097, 253, 12002, 285, 253, 10199, 403, 8489, 24363, 28763, 984, 436, 789, 1057, 417, 2686, 2953, 557, 290, 36874, 275, 253, 3282, 273, 4715, 557, 290, 33195, 14237, 432, 5304, 941, 835, 1029, 5251, 1006, 800, 2616, 2723, 281, 3907, 10103, 273, 4735, 11390, 752, 352, 1663, 12453, 310, 23694, 767, 5239, 273, 2616, 715, 1027, 4243, 273, 253, 6779, 1561, 1016, 273, 534, 253, 2616, 476, 320, 403, 1077, 2779, 403, 36255, 342, 1016, 643, 50276, 4919, 281, 253, 1127, 326, 436, 789, 310, 417, 1663, 670, 557, 290, 36874, 253, 11745, 14023, 342, 4336, 440, 35421, 1666, 25379, 403, 417, 1663, 326, 14282, 387, 1878, 417, 275, 2426, 273, 752, 436, 789, 5239, 562, 281, 513, 512, 352, 2722, 310, 1880, 1491, 670, 253, 2303, 2616, 310, 4354, 23352, 1086, 351, 494, 432, 253, 4329, 592, 534, 1223, 2905, 281, 557, 290, 36874, 2296, 1652, 670, 253, 3290, 273, 352, 327, 253, 2762, 1930, 436, 2238, 273, 11745, 5301, 835, 253, 4477, 2746, 556, 281, 921, 326, 253, 1491, 4961, 275, 253, 3451, 629, 273, 253, 2317, 310, 417, 268, 2166, 16593, 314, 1411, 253, 440, 35421, 1666, 25379, 50275, 11183, 50276, 783, 4477, 452, 1160, 247, 1175, 3434, 281, 2953, 253, 7350, 5439, 285, 891, 2868, 253, 2929, 943, 320, 7607, 275, 697, 1655, 830, 891, 452, 2559, 619, 13716, 432, 721, 281, 818, 15672, 5474, 339, 431, 248, 2929, 29328, 3806, 1754, 13460, 265, 534, 19401, 4715, 3300, 
39904, 14282, 4735, 342, 5075, 20446, 253, 21624, 4778, 4428, 767, 4243, 581, 2905, 281, 253, 3806, 873, 285, 253, 643, 19124, 281, 3657, 29458, 5482, 253, 2929, 4081, 281, 897, 8107, 27451, 4795, 275, 247, 355, 280, 12463, 8103, 253, 2929, 14371, 4722, 16774, 1543, 327, 4735, 10554, 17697, 2460, 5978, 285, 2460, 9066, 50276, 74, 13414, 1663, 923, 849, 5150, 608, 275, 13123, 27451, 16897, 4715, 28116, 1182, 26332, 1182, 4428, 512, 1491, 273, 299, 352, 3133, 581, 812, 452, 1097, 27451, 2426, 2822, 5058, 533, 671, 452, 268, 39344, 299, 50276, 3498, 91, 581, 10076, 651, 320, 253, 1083, 835, 1182, 4428, 512, 253, 1491, 670, 299, 534, 33772, 253, 3806, 21624, 3386, 594, 359, 452, 28116, 1491, 275, 1182, 275, 436, 1083, 253, 6311, 3386, 299, 403, 27096, 533, 253, 29810, 1057, 417, 897, 299, 667, 1576, 281, 5416, 326, 1182, 1057, 417, 3831, 1491, 670, 299, 581, 812, 823, 271, 48960, 23403, 326, 14177, 281, 3283, 299, 432, 1182, 3877, 326, 436, 2550, 320, 5189, 407, 253, 4735, 4715, 7982, 984, 352, 35136, 1182, 323, 45630, 21574, 1309, 3733, 50276, 783, 4679, 327, 17697, 2460, 5978, 1007, 4722, 533, 891, 4282, 604, 253, 3216, 5083, 9261, 323, 278, 79, 382, 476, 320, 3365, 2529, 347, 275, 690, 4872, 9261, 327, 253, 3236, 2460, 891, 4282, 604, 253, 4081, 1332, 2987, 327, 18504, 13107, 835, 368, 476, 897, 5203, 1491, 347, 3806, 20446, 25761, 891, 4282, 604, 352, 310, 1896, 281, 897, 2709, 3510, 273, 3806, 3888, 533, 11184, 3888, 275, 1016, 1511, 281, 3986, 10870, 390, 1014, 1805, 3045, 50276, 37585, 2792, 50276, 22309, 5467, 326, 253, 3806, 3268, 310, 18687, 3268, 3692, 1329, 556, 2557, 5058, 3185, 273, 247, 3963, 305, 12064, 50276, 23, 854, 884, 3133, 689, 9542, 1955, 281, 253, 49863, 29974, 13337, 3753, 273, 253, 8103, 891, 4282, 604, 1907, 271, 3081, 4677, 651, 1056, 1841, 30909, 50275, 28489, 352, 310, 9371, 281, 26542, 253, 355, 547, 2929, 632, 1162, 355, 323, 5150, 884, 50276, 2420, 337, 5046, 823, 253, 3159, 2975, 594, 352, 310, 30909, 534, 7982, 368, 897, 323, 534, 10895, 50276, 74, 4282, 604, 352, 310, 4344, 2217, 281, 7277, 4735, 10554, 342, 362, 3348, 285, 643, 3210, 1580, 597, 513, 417, 897, 667, 5075, 20446, 247, 22870, 83, 8245, 812, 1908, 4715, 342, 253, 5075, 20446, 13301, 4508, 253, 1491, 326, 690, 3888, 452, 253, 1072, 5203, 253, 7756, 327, 2818, 3024, 2429, 281, 3963, 362, 3348, 1057, 417, 1007, 8644, 1677, 253, 3081, 5075, 20446, 5474, 339, 793, 360, 3454, 1677, 767, 5239, 273, 941, 835, 581, 310, 440, 47728, 285, 253, 643, 310, 247, 3806, 941, 873, 342, 247, 1798, 2803, 273, 7629, 326, 310, 4229, 253, 2746, 557, 290, 19236, 436, 2803, 273, 7629, 432, 253, 2571, 253, 2746, 4648, 247, 362, 3348, 3692, 4329, 592, 403, 8085, 715, 299, 326, 6125, 253, 2803, 273, 7629, 285, 1182, 326, 6125, 253, 5780, 2616, 247, 13123, 27451, 2957, 326, 310, 34930, 970, 253, 4038, 29603, 10480, 310, 5556, 1701, 323, 253, 4715, 285, 253, 1332, 310, 3732, 281, 278, 79, 382, 6670, 3740, 557, 290, 36874, 285, 2818, 3024, 17754, 2048, 557, 290, 36874, 50276, 856, 84, 50276, 49346, 3542, 50276, 16680, 1007, 12532, 1097, 11745, 285, 18276, 50276, 5040, 50276, 679, 19683, 1162, 355, 557, 290, 2134, 247, 2173, 2803, 432, 2571, 1293, 6843, 13301, 533, 407, 10263, 767, 3888, 342, 253, 1072, 1318, 273, 253, 7616, 2803, 26332, 10263, 432, 253, 3806, 873, 285, 671, 10263, 247, 2626, 2460, 342, 247, 667, 1318, 273, 253, 7616, 2803, 26332, 10263, 432, 253, 440, 47728, 873, 7613, 616, 2746, 310, 3587, 7763, 281, 253, 1895, 387, 1133, 275, 253, 2929, 3738, 14168, 19683, 1162, 355, 897, 6670, 1664, 6489, 347, 253, 6096, 
2803, 616, 1332, 310, 3587, 7763, 281, 253, 1083, 835, 253, 6096, 2803, 310, 6670, 3740, 36101, 2048, 7613, 352, 4620, 281, 479, 326, 352, 943, 320, 2429, 1411, 50276, 33722, 3806, 50276, 67, 9764, 317, 950, 50276, 911, 20692, 13301, 403, 2649, 1677, 285, 941, 310, 24104, 835, 1016, 1387, 10764, 247, 2803, 273, 945, 533, 1060, 253, 941, 310, 8025, 281, 320, 10883, 264, 715, 2390, 594, 627, 310, 642, 6425, 281, 253, 440, 13068, 620, 264, 873, 7613, 2834, 281, 7277, 1411, 323, 253, 18627, 8892, 50276, 1747, 13218, 5301, 1411, 440, 35421, 557, 290, 36874, 3082, 627, 452, 644, 625, 3332, 7274, 1580, 701, 2623, 70, 285, 12539, 21574, 24088, 2803, 21574, 465, 303, 1162, 355, 246, 17312, 3348, 260, 864, 1162, 355, 352, 651, 320, 5322, 281, 7277, 1411, 841, 3082, 417, 760, 3066, 15970, 7200, 273, 2303, 2616, 533, 671, 970, 557, 290, 36874, 17082, 7616, 275, 841, 9380, 50276, 977, 2805, 84, 26122, 50276, 783, 27451, 2426, 275, 608, 403, 540, 44374, 1955, 281, 253, 16689, 268, 2310, 285, 819, 89, 7613, 767, 4858, 20741, 2392, 878, 281, 320, 908, 281, 16851, 767, 4858, 4038, 11878, 2403, 253, 1566, 2581, 1781, 285, 9542, 342, 1142, 4886, 4243, 752, 651, 5108, 604, 841, 27451, 2426, 275, 608, 403, 8231, 285, 581, 3365, 4648, 48237, 30890, 281, 5556, 885, 253, 4795, 2957, 1293, 253, 878, 323, 20741, 2392, 3798, 20741, 2392, 5257, 281, 11306, 45166, 4038, 11878, 923, 24088, 687, 1026, 66, 1162, 355, 3340, 16689, 2931, 327, 1029, 10103, 594, 352, 1537, 320, 1682, 281, 3693, 731, 10793, 1896, 253, 8284, 273, 6240, 14433, 2426, 281, 253, 2957, 275, 884, 310, 4931, 1941, 273, 436, 984, 841, 14433, 2426, 403, 2168, 1246, 275, 253, 2957, 495, 50276, 22, 326, 253, 7134, 12915, 943, 320, 4020, 839, 594, 253, 15504, 273, 4465, 3963, 5837, 273, 841, 14433, 2426, 5936, 326, 253, 7134, 12915, 310, 4933, 4105, 8197, 273, 731, 253, 14433, 2426, 323, 14756, 275, 608, 3176, 4209, 281, 3490, 253, 1566, 281, 897, 299, 534, 310, 253, 16038, 1677, 275, 253, 2929, 323, 970, 253, 13123, 27451, 33917, 281, 849, 2192, 19356, 5621, 253, 1566, 281, 897, 253, 4329, 592, 594, 253, 15504, 273, 253, 27451, 2426, 275, 608, 310, 30455, 285, 4620, 281, 878, 2007, 22861, 285, 263, 28913, 2175, 50276, 37585, 2139, 417, 3037, 253, 12177, 11041, 29331, 50275, 250, 4694, 50276, 74, 717, 13762, 407, 253, 30080, 22559, 273, 253, 4477, 7613, 452, 7321, 619, 4868, 15672, 187, 187, 4118, 18435, 27, 2520, 310, 247, 4081, 1332, 326, 2175, 4715, 273, 557, 290, 33195, 14237, 275, 247, 4942, 2173, 4758, 2931, 347, 3637, 1677, 767, 15302, 581, 440, 22027, 285, 1529, 326, 556, 247, 1798, 2803, 273, 7629, 4229, 253, 1332, 588, 557, 290, 2134, 253, 2803, 273, 7629, 432, 253, 2571, 253, 30628, 1119, 253, 1332, 12532, 342, 4722, 1543, 4426, 50276, 17149, 50276, 783, 32213, 273, 253, 1332, 347, 5469, 275, 253, 10123, 285, 846, 50275, 783, 11745, 1543, 342, 5075, 20446, 403, 417, 247, 1943, 7756, 689, 701, 2623, 44549, 3082, 390, 14168, 19683, 1162, 355, 50276, 66, 2502, 7908, 273, 16308, 281, 479, 310, 326, 352, 310, 417, 1077, 2590, 835, 253, 15988, 403, 3551, 432, 253, 4477, 1750, 281, 452, 2218, 247, 4344, 5301, 342, 253, 2710, 1666, 25379, 533, 597, 9569, 271, 7094, 747, 32049, 48759, 10336, 326, 369, 2779, 1901, 2084, 3441, 533, 1335, 24251, 625, 281, 616, 1332, 685, 2571, 50276, 783, 9978, 347, 3559, 310, 8489, 13345, 285, 1679, 2087, 685, 352, 812, 320, 2299, 436, 369, 417, 247, 2201, 2803, 275, 619, 3061, 352, 310, 3477, 281, 755, 13477, 407, 253, 2238, 273, 557, 290, 356, 1070, 14237, 326, 436, 789, 310, 26400, 281, 755, 50276, 74, 1158, 436, 556, 
253, 2442, 281, 320, 247, 4891, 2929, 533, 387, 436, 3924, 697, 5816, 247, 1180, 273, 28913, 2175, 281, 7777, 2096, 752, 5239, 352, 7419, 432, 253, 2045, 789, 387, 253, 1077, 1878, 627, 310, 247, 1180, 273, 27934, 285, 3733, 10165, 275, 30762, 277, 50276, 3022, 253, 470, 1099, 5926, 483, 50276, 3529, 2430, 625, 8813, 50276, 358, 5378, 474, 4685, 285, 849, 597, 39970, 281, 643, 15302, 50276, 28821, 512, 273, 436, 387, 436, 1127, 352, 310, 1892, 323, 479, 281, 5583, 14924, 273, 436, 789, 891, 11907, 253, 4477, 281, 1379, 512, 436, 8680, 715, 2395, 9017, 616, 789, 281, 625, 10625, 253, 21518, 4826, 557, 290, 36874, 326, 597, 3748, 3133, 751, 247, 1175, 2934, 285, 2085, 625, 16774, 1941, 670, 616, 27934, 10165, 285, 616, 1055, 327, 253, 1543 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this work aims to address the domain generalization problem in semantic segmentation a simple and effective adversarial augmentation technic is proposed which changes global statistics at the image level through adversarial learning the adversarial style features seem to capture the characteristics of other datasets successfully and improve the performance significantly strengths the main idea of advstyle is simple and easy to implement i believe this can benefit many computer vision tasks especially the dense prediction tasks the illustration of how style changes affect the segmentation performance in figure 1 is good it can be clearly observed that the miou decreases a lot when only changing scene colors the tsne visualization in figure 5 provides good proof for the advstyle idea the adversarial style features can capture some characteristics of other datasets sufficient experiments have been done including comparisons with common dataaugmentation technics sota dg methods and extra experiments on classification tasks weaknesses in figure 1 it is claimed that imagelevel meanvariance is used as the style feature while it is not clear how to calculate and apply such style features 
in line 70 it is claimed that 6dim feature for each example so i wonder if the style features are just mean variance of rgb channels if so why not use other color spaces or even texture spaces i think this is an important ablation study that is missing in the main paper the limitations and potential negative societal impact have been well described docsepthe paper addresses the general problem of learning a model in one domain and testing it in a second domain where there is some domain shift between the domains specifically it demonstrates a method that improves testing performance on real unknown data when learning on synthetic rendered data in semantic segmentation for autonomous driving the motivating observation is that in urban semantic segmentation for autonomous driving a concise domain shift is measurable in the perchannel mean and variance of the image data in other words the color statistics between different data sets and acquisition conditions differ the authors propose to explicitly target adversarial changes of the mean and variance of the first domain images during training to improve generalization performance in a second unknown domain the method works in two phases for each batch iteration where the mean and variance of the input image can be modified during the forward pass adversarial updating of the meanvariance perturbation based on the current frozen model prediction and then regular updating of model weights with both adversarial as well as unperturbed samples the paper continues to evaluate the performance of the proposed method on one synthetic versus three real datasets in semantic segmentation for autonomous driving gtav versus cityscapes bdd mapillary as well as several choices for backbones and batch normalization schemes the empirical performance results demonstrate significant improvements over the chosen baselines furthermore the paper extends the results to image classification on two relevant benchmarks digits and and pacs the paper is written well and the method is laid out with sufficient clarity and detail the datasets chosen for comparison are relevant and realistic in terms of potential realworld applications of the method i appreciate the motivation we observe measurable population differences between various datasets and that they can be captured in a compact and semantically meaningful vector already image mean and variance how can we use this prior knowledge to improve task performance in semantic segmentation based on this motivation the authors conclude to model the observed perturbation explicitly as it happens to be a natural part of compact image formation models already in the sense that normalization with these statistics is well known and commonly used once it is modeled the authors show that it is a comparatively simple step to adversarially predict a new set of samples that are harder to predict at the current training state an appealing property of the method is that the perturbation space is fairly lowdimensional 6vector and thus more easily characterized compared to highdimensional perturbations for instance on the level of individual pixels a significant concern with the submission is that it poses its method as domain generalization dg rather than image augmentation ia and the subsequent analysis and choice of baselines i believe this confounds the comparison to prior art and comparable methods if i take the dg perspective id have to ask will this work for other dg tasks that do not make strong assumptions about the signal image 
formation process meaning of mean/variance i can find no evidence for this in the paper and it appears that the main premise is a strong domain assumption the image formation process and selection of data to conform with this assumption ie the datasets are chosen such that they differ demonstrably in the chosen and explicitly modeled statistic alternatively the message of the submission could be that for a set of domains i can a priori examine some lowdimensional formative process that i then can exploit to generate better samples in terms of generalization across the prior domains this could be reformulated from the perspective of image augmentation ia if i take on the perspective of ia id have to ask if the cited prior art baselines and benchmarks are chosen appropriately for prior art there would be significant references missing for instance 1 and 2 below demonstrate generic intraining adversarial augmentation conceivably perchannel image mean and variance could be included in these approaches as well without fundamental changes to the algorithms how would the presented method compare in this case could it be extended to other explicitly modeled image formation processes luminance inplane shift etc as an example lets say my domain shift were inplane rotations eg relatively static carbased camera vs smart phone from pedestrian view the presented method would not be expected to demonstrate much benefit as the domain assumption of shift in image mean and variance is violated one could probably model the inplane rotation with a similar adversarial approach to sample generation but this essentially requires explicit knowledge of the domain shift in a lowdimensional parametric forward way as it is presented the empirical results are appealing as the method is relatively straightforward and improves over the presented baselines but the level of contribution is unclear as the choice of baselines lacks a thorough sample of and comparison to state of the art image augmentation methods (1) haotao wang, chaowei xiao, jean kossaifi, zhiding yu, anima anandkumar, and zhangyang wang, augmax: adversarial composition of random augmentations for robust training, in advances in neural information processing systems (a. beygelzimer, y. dauphin, p. liang, j. wortman vaughan, eds.), 2021, https://openreview.net/forum?id=p5mtdcvdfz4 (2) xinyu zhang, qiang wang, jian zhang, and zhao zhong, adversarial autoaugment, in international conference on learning representations, 2020, https://openreview.net/forum?id=byxduyskvs the limitations of the work are fairly clear from the submission docsepthis paper proposes a new adversarial style augmentation strategy advstyle for domain generalization in the semantic segmentation task the approach of advstyle can generate hard stylized images during training preventing the model from overfitting on the source domain the generation is completed via learning adversarial style features experiments on two semantic segmentation benchmarks demonstrate the effectiveness of advstyle strengths the proposed approach can obviously improve the performance of baselines on unseen domains including segmentation and classification tasks weakness 1 the proposed advstyle is indeed an approach to extend the appearance diversity of the training images and this approach cannot ensure the performance on a given unseen domain ie this method lacks interpretability and advstyle can
improve the generalization ability via color augmentation but more augmentation effects are needed such as lighting for the generalization of a model from normallight training data to lowlight testing data 2 why not apply the advstyle on sota classification baselines like meada in table 6 and l2d in table 7 i wonder whether the performance improvement is highly related with the baseline performance 3 the superiority compared with current sota augmentation method is not obvious like the mixstyle reported in table 2 of supp more comparison is needed and such comparisons should be placed in the main paper not supp i agree with authors stated limitations in the supp which is the training time the author can consider how to reduce the training cost ### Summary:
a simple and practical way to do better at domain generalization when it comes to semantic segmentation advstyle can generate images that are hard for the current model during training and prevent the model from overfitting on the source domain given that it works well is relatively simple to implement and is conceptually sound i think it will appeal to a large portion of the neurips audience that works on domain generalization
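the reviews above describe advstyle as a two-phase update per batch: first ascend the task loss with respect to a per-channel mean/variance (style) perturbation of the input images while the model is held frozen, then update the model weights on both the clean and the adversarially restylized samples. a minimal pytorch-style sketch of that loop is given below; it is an illustration of the idea as the reviewers describe it, not the paper's implementation, and the function names, the single sign-based inner step, the step size adv_lr, and the ignore_index=255 segmentation convention are all assumptions.

```python
# illustrative sketch only: adversarial perturbation of per-channel image
# mean/variance ("style"), as described in the reviews above; the single
# sign-based inner step, adv_lr, and ignore_index=255 are assumptions.
import torch
import torch.nn.functional as F


def channel_stats(x, eps=1e-6):
    # x: (N, C, H, W) -> per-image, per-channel mean and std, shape (N, C, 1, 1)
    mu = x.mean(dim=(2, 3), keepdim=True)
    sigma = x.std(dim=(2, 3), keepdim=True) + eps
    return mu, sigma


def restylize(x, mu_new, sigma_new):
    # swap each image's channel statistics for the perturbed ones
    mu, sigma = channel_stats(x)
    return (x - mu) / sigma * sigma_new + mu_new


def advstyle_step(model, optimizer, images, labels, adv_lr=1.0):
    mu, sigma = channel_stats(images)

    # phase 1: ascend the task loss w.r.t. the style statistics while the
    # model itself stays frozen (its weights are not updated here)
    mu_adv = mu.clone().detach().requires_grad_(True)
    sigma_adv = sigma.clone().detach().requires_grad_(True)
    loss_style = F.cross_entropy(model(restylize(images, mu_adv, sigma_adv)),
                                 labels, ignore_index=255)
    g_mu, g_sigma = torch.autograd.grad(loss_style, [mu_adv, sigma_adv])
    with torch.no_grad():
        mu_adv += adv_lr * g_mu.sign() * mu.abs()
        sigma_adv += adv_lr * g_sigma.sign() * sigma.abs()
        sigma_adv.clamp_(min=1e-3)

    # phase 2: regular weight update on both clean and adversarially
    # restylized samples
    adv_images = restylize(images, mu_adv.detach(), sigma_adv.detach())
    optimizer.zero_grad()
    loss = (F.cross_entropy(model(images), labels, ignore_index=255)
            + F.cross_entropy(model(adv_images), labels, ignore_index=255))
    loss.backward()
    optimizer.step()
    return loss.item()
```

because the perturbation lives in the small per-image style space rather than in pixel space, the inner adversarial step stays cheap, which is what the reviews credit for the method's simplicity.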
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 789, 13698, 281, 2953, 253, 5028, 26647, 1895, 275, 24705, 26405, 247, 2969, 285, 3576, 48960, 42072, 1732, 280, 310, 4081, 534, 2544, 4156, 9990, 387, 253, 2460, 1268, 949, 48960, 4715, 253, 48960, 3740, 3386, 1646, 281, 9232, 253, 5319, 273, 643, 15302, 8379, 285, 3157, 253, 3045, 3012, 20544, 50276, 783, 2022, 2934, 273, 1604, 4826, 310, 2969, 285, 3477, 281, 3359, 891, 2868, 436, 476, 5649, 1142, 4382, 8113, 8892, 3340, 253, 14086, 10554, 8892, 50275, 783, 23356, 273, 849, 3740, 2544, 2818, 253, 26405, 3045, 275, 4677, 337, 310, 1175, 352, 476, 320, 4518, 2540, 326, 253, 3641, 276, 12075, 247, 2257, 672, 760, 6890, 6200, 9830, 50274, 783, 28669, 570, 24426, 275, 4677, 608, 3400, 1175, 4737, 323, 253, 1604, 4826, 2934, 253, 48960, 3740, 3386, 476, 9232, 690, 5319, 273, 643, 15302, 50274, 31031, 4679, 452, 644, 2218, 1690, 14023, 342, 1846, 941, 2321, 16977, 1732, 982, 256, 5503, 277, 72, 3082, 285, 4465, 4679, 327, 9162, 8892, 50276, 20881, 1255, 265, 50276, 249, 4677, 337, 352, 310, 7558, 326, 2460, 5251, 1599, 87, 14417, 310, 908, 347, 253, 3740, 4735, 1223, 352, 310, 417, 2590, 849, 281, 10173, 285, 4647, 824, 3740, 3386, 209, 40702, 249, 1386, 5571, 352, 310, 7558, 326, 721, 4528, 4735, 323, 1016, 1650, 594, 891, 4282, 604, 253, 3740, 3386, 403, 816, 1599, 50276, 87, 14417, 273, 46206, 8123, 604, 594, 2139, 417, 897, 643, 3295, 8470, 390, 1014, 14542, 8470, 891, 1158, 436, 310, 271, 1774, 28913, 1263, 326, 310, 5816, 275, 253, 2022, 2929, 50275, 783, 7364, 285, 2442, 4016, 38058, 3486, 452, 644, 973, 2529, 5474, 339, 431, 248, 2929, 12453, 253, 2087, 1895, 273, 4715, 247, 1566, 275, 581, 5028, 285, 5175, 352, 275, 247, 1273, 5028, 835, 627, 310, 690, 5028, 5333, 875, 253, 10625, 5742, 352, 14371, 247, 1332, 326, 19132, 5175, 3045, 327, 1524, 7202, 941, 672, 4715, 327, 13506, 13697, 941, 275, 24705, 26405, 323, 26279, 6276, 253, 15265, 839, 8310, 310, 326, 275, 10106, 24705, 26405, 323, 26279, 6276, 247, 44003, 5028, 5333, 310, 27289, 275, 253, 591, 13695, 1599, 285, 11041, 273, 253, 2460, 941, 275, 643, 3000, 253, 3295, 9990, 875, 1027, 941, 5239, 285, 11931, 2515, 9184, 253, 4477, 12661, 281, 11120, 2303, 48960, 2544, 273, 253, 1599, 285, 11041, 273, 253, 806, 5028, 3888, 1309, 3733, 281, 3157, 26647, 3045, 275, 247, 1273, 7202, 5028, 50276, 783, 1332, 2987, 275, 767, 12475, 323, 1016, 14604, 19502, 835, 253, 1599, 285, 11041, 273, 253, 3280, 2460, 476, 320, 7321, 1309, 253, 3579, 1509, 48960, 22753, 273, 253, 1599, 87, 14417, 20452, 1754, 327, 253, 1655, 13831, 1566, 10554, 285, 840, 3963, 22753, 273, 1566, 13461, 342, 1097, 48960, 347, 973, 347, 440, 8292, 16193, 3530, 50276, 783, 2929, 7788, 281, 7472, 253, 3045, 273, 253, 4081, 1332, 327, 581, 13506, 7147, 1264, 1524, 15302, 275, 24705, 26405, 323, 26279, 6276, 305, 85, 580, 7147, 2846, 1026, 9652, 270, 1678, 3711, 12537, 347, 973, 347, 2067, 10165, 323, 896, 47473, 285, 14604, 21539, 15849, 253, 16774, 3045, 1543, 7568, 1534, 11701, 689, 253, 6777, 1666, 25379, 33810, 253, 2929, 8725, 253, 1543, 281, 2460, 9162, 327, 767, 4623, 49602, 24321, 285, 285, 268, 18944, 253, 2929, 310, 3542, 973, 285, 253, 1332, 310, 10090, 562, 342, 4209, 19843, 285, 2508, 253, 15302, 6777, 323, 5301, 403, 4623, 285, 15958, 275, 2426, 273, 2442, 1524, 10186, 4893, 273, 253, 1332, 891, 11435, 253, 16038, 359, 10018, 27289, 3072, 3910, 875, 2710, 15302, 285, 326, 597, 476, 320, 10848, 275, 247, 8566, 
285, 3300, 39904, 14282, 4972, 2168, 2460, 1599, 285, 11041, 849, 476, 359, 897, 436, 2720, 3640, 281, 3157, 4836, 3045, 275, 24705, 26405, 1754, 327, 436, 16038, 253, 4477, 7525, 281, 1566, 253, 2540, 20452, 11120, 347, 352, 6569, 281, 320, 247, 3626, 629, 273, 8566, 2460, 4702, 3210, 2168, 275, 253, 3282, 326, 21539, 342, 841, 9990, 310, 973, 1929, 285, 7744, 908, 2378, 352, 310, 23115, 253, 4477, 921, 326, 352, 310, 247, 31381, 2969, 3213, 281, 18539, 274, 1365, 3283, 247, 747, 873, 273, 3530, 326, 403, 12150, 281, 3283, 387, 253, 1655, 3733, 1375, 271, 23176, 2867, 273, 253, 1332, 310, 326, 253, 20452, 2317, 310, 9648, 1698, 6967, 721, 11000, 285, 3021, 625, 4354, 7943, 2429, 281, 1029, 6967, 26309, 323, 4227, 327, 253, 1268, 273, 2060, 15115, 50276, 66, 1534, 4468, 342, 253, 19529, 310, 326, 352, 24543, 697, 1332, 347, 5028, 26647, 277, 72, 2581, 685, 2460, 42072, 209, 571, 285, 253, 6774, 1783, 285, 4327, 273, 1666, 25379, 891, 2868, 436, 1461, 2261, 253, 5301, 281, 2720, 1445, 285, 10870, 3082, 604, 891, 1379, 253, 277, 72, 8668, 2654, 452, 281, 1642, 588, 436, 789, 323, 643, 277, 72, 8892, 326, 513, 417, 1056, 2266, 13260, 670, 253, 2625, 2460, 4702, 1232, 4495, 273, 1599, 87, 14417, 891, 476, 1089, 642, 1941, 323, 436, 275, 253, 2929, 285, 352, 4620, 326, 253, 2022, 26536, 310, 247, 2266, 5028, 9376, 253, 2460, 4702, 1232, 285, 5438, 273, 941, 281, 10138, 342, 436, 9376, 26332, 253, 15302, 403, 6777, 824, 326, 597, 9184, 2837, 1598, 275, 253, 6777, 285, 11120, 23115, 26312, 31506, 253, 3935, 273, 253, 19529, 812, 320, 326, 323, 247, 873, 273, 10625, 891, 476, 247, 30400, 9186, 690, 1698, 6967, 830, 800, 1232, 326, 891, 840, 476, 22059, 281, 6635, 1805, 3530, 275, 2426, 273, 26647, 2439, 253, 2720, 10625, 436, 812, 320, 8460, 2907, 432, 253, 8668, 273, 2460, 42072, 209, 571, 604, 891, 1379, 327, 253, 8668, 273, 209, 571, 2654, 452, 281, 1642, 604, 253, 11106, 2720, 1445, 1666, 25379, 285, 49602, 403, 6777, 20420, 323, 2720, 1445, 627, 651, 320, 1534, 10414, 5816, 323, 4227, 337, 285, 374, 2708, 7568, 12314, 8376, 1699, 48960, 42072, 10686, 400, 1598, 591, 13695, 2460, 1599, 285, 11041, 812, 320, 2908, 275, 841, 7274, 347, 973, 1293, 7936, 2544, 281, 253, 11333, 849, 651, 253, 3559, 1332, 7277, 275, 436, 1083, 812, 352, 320, 6508, 281, 643, 11120, 23115, 2460, 4702, 4870, 48560, 275, 13568, 5333, 3966, 50276, 284, 271, 1650, 14935, 1333, 619, 5028, 5333, 497, 275, 13568, 39501, 24088, 4942, 4228, 1113, 3169, 6568, 4632, 7060, 4481, 432, 34792, 1859, 253, 3559, 1332, 651, 417, 320, 3264, 281, 7568, 1199, 5649, 347, 253, 5028, 9376, 273, 5333, 275, 2460, 1599, 285, 11041, 403, 13588, 581, 812, 3164, 1566, 253, 275, 13568, 9381, 342, 247, 2074, 48960, 2746, 281, 3410, 5978, 533, 436, 9093, 4419, 6843, 3640, 273, 253, 5028, 10739, 275, 247, 1698, 6967, 36833, 3579, 1039, 50275, 284, 352, 310, 3559, 253, 16774, 1543, 403, 23176, 347, 253, 1332, 310, 4942, 15246, 285, 19132, 689, 253, 3559, 1666, 25379, 533, 253, 1268, 273, 7680, 310, 12744, 347, 253, 4327, 273, 1666, 25379, 3480, 247, 11080, 3410, 273, 285, 5301, 281, 1375, 273, 253, 1445, 2460, 42072, 3082, 50275, 18, 275, 856, 22868, 259, 606, 938, 1797, 2321, 4090, 4060, 2321, 4090, 48960, 5889, 273, 3632, 35919, 569, 323, 10237, 3733, 2488, 3227, 5503, 80, 259, 606, 285, 11450, 319, 27116, 1269, 22728, 285, 5139, 266, 465, 30797, 18279, 285, 1182, 73, 2821, 340, 86, 285, 3865, 66, 271, 395, 76, 22711, 285, 1182, 12109, 31524, 259, 606, 1984, 5564, 24301, 1972, 275, 11454, 1491, 5162, 2718, 8121, 66, 320, 90, 11500, 91, 7438, 285, 
340, 4204, 484, 23187, 285, 268, 632, 606, 285, 480, 259, 430, 1342, 362, 3920, 266, 807, 938, 1797, 9688, 3614, 5758, 15337, 3024, 39061, 301, 81, 22, 78, 2851, 17312, 4989, 91, 21, 50275, 19, 275, 856, 22868, 1182, 12109, 14952, 324, 735, 24406, 4060, 324, 735, 24406, 6753, 2321, 420, 2488, 89, 5104, 86, 1182, 12109, 285, 2805, 22589, 259, 606, 285, 480, 757, 1182, 12109, 285, 1182, 31035, 1182, 73, 543, 1984, 5564, 48985, 8059, 327, 4715, 14237, 807, 14952, 9688, 3614, 5758, 15337, 3024, 39061, 301, 1615, 89, 563, 656, 76, 10936, 50275, 783, 7364, 273, 253, 789, 403, 9648, 2590, 432, 253, 19529, 5474, 33032, 2520, 2929, 12661, 247, 747, 48960, 3740, 42072, 5700, 1604, 4826, 323, 5028, 26647, 275, 253, 24705, 26405, 4836, 253, 2746, 273, 1604, 4826, 476, 6635, 1892, 17521, 1025, 3888, 1309, 3733, 13538, 253, 1566, 432, 689, 31893, 327, 253, 2603, 5028, 253, 5978, 310, 6312, 3066, 4715, 48960, 3740, 4735, 4679, 327, 767, 24705, 26405, 49602, 7568, 253, 12510, 273, 1604, 4826, 50276, 296, 3755, 20556, 253, 4081, 2746, 476, 9090, 3157, 253, 3045, 273, 1666, 25379, 3045, 327, 39709, 10625, 1690, 26405, 285, 9162, 8892, 50276, 20881, 1255, 337, 253, 4081, 1604, 4826, 310, 6296, 271, 2746, 281, 9017, 253, 7286, 9991, 273, 253, 3733, 3888, 285, 436, 2746, 476, 417, 5416, 253, 3045, 327, 581, 39709, 5028, 26332, 436, 1332, 19756, 253, 4665, 1430, 285, 1604, 4826, 476, 3157, 253, 26647, 3745, 3066, 3295, 42072, 533, 625, 42072, 2538, 403, 3058, 824, 347, 15632, 323, 253, 26647, 273, 247, 1566, 432, 5222, 455, 429, 3733, 941, 281, 1698, 3243, 5175, 941, 50276, 19, 2139, 417, 4647, 253, 1604, 4826, 327, 256, 5503, 9162, 1666, 25379, 751, 479, 2960, 275, 2829, 721, 285, 298, 19, 69, 275, 2829, 818, 891, 4282, 1880, 253, 3045, 7756, 310, 4122, 2905, 342, 253, 8245, 3045, 50276, 20, 253, 34385, 2429, 342, 1655, 256, 5503, 42072, 1332, 310, 417, 4755, 751, 253, 5878, 4826, 2361, 275, 2829, 374, 273, 915, 625, 5301, 310, 3058, 285, 824, 14023, 943, 320, 4845, 275, 253, 2022, 2929, 417, 915, 50273, 74, 5194, 342, 4477, 4767, 7364, 275, 253, 915, 534, 310, 253, 3733, 673, 253, 2488, 476, 1908, 849, 281, 4796, 253, 3733, 2105, 50276, 187, 187, 4118, 18435, 27, 19583, 285, 8542, 1039, 281, 513, 1805, 387, 5028, 26647, 672, 352, 3249, 281, 24705, 26405, 1604, 4826, 476, 6635, 3888, 326, 403, 1892, 1309, 3733, 285, 3657, 253, 1566, 432, 689, 31893, 327, 253, 2603, 5028, 1677, 326, 352, 2987, 973, 310, 4942, 2969, 281, 3359, 285, 4473, 1230, 3590, 891, 1158, 352, 588, 4549, 281, 247, 1781, 5110, 273, 253, 5723, 2824, 8446, 326, 2987, 327, 5028, 26647 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 789, 13698, 281, 2953, 253, 5028, 26647, 1895, 275, 24705, 26405, 247, 2969, 285, 3576, 48960, 42072, 1732, 280, 310, 4081, 534, 2544, 4156, 9990, 387, 253, 2460, 1268, 949, 48960, 4715, 253, 48960, 3740, 3386, 1646, 281, 9232, 253, 5319, 273, 643, 15302, 8379, 285, 3157, 253, 3045, 3012, 20544, 50276, 783, 2022, 2934, 273, 1604, 4826, 310, 2969, 285, 3477, 281, 3359, 891, 2868, 436, 476, 5649, 1142, 4382, 8113, 8892, 3340, 253, 14086, 10554, 8892, 50275, 783, 23356, 273, 849, 3740, 2544, 2818, 253, 26405, 3045, 275, 4677, 337, 310, 1175, 352, 476, 320, 4518, 2540, 326, 253, 3641, 276, 12075, 247, 2257, 672, 760, 6890, 6200, 9830, 50274, 783, 28669, 570, 24426, 275, 4677, 608, 3400, 1175, 4737, 323, 253, 1604, 4826, 2934, 253, 48960, 3740, 3386, 476, 9232, 690, 5319, 273, 643, 15302, 50274, 31031, 4679, 452, 644, 2218, 1690, 14023, 342, 1846, 941, 2321, 16977, 1732, 982, 256, 5503, 277, 72, 3082, 285, 4465, 4679, 327, 9162, 8892, 50276, 20881, 1255, 265, 50276, 249, 4677, 337, 352, 310, 7558, 326, 2460, 5251, 1599, 87, 14417, 310, 908, 347, 253, 3740, 4735, 1223, 352, 310, 417, 2590, 849, 281, 10173, 285, 4647, 824, 3740, 3386, 209, 40702, 249, 1386, 5571, 352, 310, 7558, 326, 721, 4528, 4735, 323, 1016, 1650, 594, 891, 4282, 604, 253, 3740, 3386, 403, 816, 1599, 50276, 87, 14417, 273, 46206, 8123, 604, 594, 2139, 417, 897, 643, 3295, 8470, 390, 1014, 14542, 8470, 891, 1158, 436, 310, 271, 1774, 28913, 1263, 326, 310, 5816, 275, 253, 2022, 2929, 50275, 783, 7364, 285, 2442, 4016, 38058, 3486, 452, 644, 973, 2529, 5474, 339, 431, 248, 2929, 12453, 253, 2087, 1895, 273, 4715, 247, 1566, 275, 581, 5028, 285, 5175, 352, 275, 247, 1273, 5028, 835, 627, 310, 690, 5028, 5333, 875, 253, 10625, 5742, 352, 14371, 247, 1332, 326, 19132, 5175, 3045, 327, 1524, 7202, 941, 672, 4715, 327, 13506, 13697, 941, 275, 24705, 26405, 323, 26279, 6276, 253, 15265, 839, 8310, 310, 326, 275, 10106, 24705, 26405, 323, 26279, 6276, 247, 44003, 5028, 5333, 310, 27289, 275, 253, 591, 13695, 1599, 285, 11041, 273, 253, 2460, 941, 275, 643, 3000, 253, 3295, 9990, 875, 1027, 941, 5239, 285, 11931, 2515, 9184, 253, 4477, 12661, 281, 11120, 2303, 48960, 2544, 273, 253, 1599, 285, 11041, 273, 253, 806, 5028, 3888, 1309, 3733, 281, 3157, 26647, 3045, 275, 247, 1273, 7202, 5028, 50276, 783, 1332, 2987, 275, 767, 12475, 323, 1016, 14604, 19502, 835, 253, 1599, 285, 11041, 273, 253, 3280, 2460, 476, 320, 7321, 1309, 253, 3579, 1509, 48960, 22753, 273, 253, 1599, 87, 14417, 20452, 1754, 327, 253, 1655, 13831, 1566, 10554, 285, 840, 3963, 22753, 273, 1566, 13461, 342, 1097, 48960, 347, 973, 347, 440, 8292, 16193, 3530, 50276, 783, 2929, 7788, 281, 7472, 253, 3045, 273, 253, 4081, 1332, 327, 581, 13506, 7147, 1264, 1524, 15302, 275, 24705, 26405, 323, 26279, 6276, 305, 85, 580, 7147, 2846, 1026, 9652, 270, 1678, 3711, 12537, 347, 973, 347, 2067, 10165, 323, 896, 47473, 285, 14604, 21539, 15849, 253, 16774, 3045, 1543, 7568, 1534, 11701, 689, 253, 6777, 1666, 25379, 33810, 253, 2929, 8725, 253, 1543, 281, 2460, 9162, 327, 767, 4623, 49602, 24321, 285, 285, 268, 18944, 253, 2929, 310, 3542, 973, 285, 253, 1332, 310, 10090, 562, 342, 4209, 19843, 285, 2508, 253, 15302, 6777, 323, 5301, 403, 4623, 285, 15958, 275, 2426, 273, 2442, 1524, 10186, 4893, 273, 253, 1332, 891, 11435, 253, 16038, 359, 10018, 27289, 3072, 3910, 875, 2710, 15302, 285, 326, 597, 476, 320, 10848, 275, 247, 8566, 
285, 3300, 39904, 14282, 4972, 2168, 2460, 1599, 285, 11041, 849, 476, 359, 897, 436, 2720, 3640, 281, 3157, 4836, 3045, 275, 24705, 26405, 1754, 327, 436, 16038, 253, 4477, 7525, 281, 1566, 253, 2540, 20452, 11120, 347, 352, 6569, 281, 320, 247, 3626, 629, 273, 8566, 2460, 4702, 3210, 2168, 275, 253, 3282, 326, 21539, 342, 841, 9990, 310, 973, 1929, 285, 7744, 908, 2378, 352, 310, 23115, 253, 4477, 921, 326, 352, 310, 247, 31381, 2969, 3213, 281, 18539, 274, 1365, 3283, 247, 747, 873, 273, 3530, 326, 403, 12150, 281, 3283, 387, 253, 1655, 3733, 1375, 271, 23176, 2867, 273, 253, 1332, 310, 326, 253, 20452, 2317, 310, 9648, 1698, 6967, 721, 11000, 285, 3021, 625, 4354, 7943, 2429, 281, 1029, 6967, 26309, 323, 4227, 327, 253, 1268, 273, 2060, 15115, 50276, 66, 1534, 4468, 342, 253, 19529, 310, 326, 352, 24543, 697, 1332, 347, 5028, 26647, 277, 72, 2581, 685, 2460, 42072, 209, 571, 285, 253, 6774, 1783, 285, 4327, 273, 1666, 25379, 891, 2868, 436, 1461, 2261, 253, 5301, 281, 2720, 1445, 285, 10870, 3082, 604, 891, 1379, 253, 277, 72, 8668, 2654, 452, 281, 1642, 588, 436, 789, 323, 643, 277, 72, 8892, 326, 513, 417, 1056, 2266, 13260, 670, 253, 2625, 2460, 4702, 1232, 4495, 273, 1599, 87, 14417, 891, 476, 1089, 642, 1941, 323, 436, 275, 253, 2929, 285, 352, 4620, 326, 253, 2022, 26536, 310, 247, 2266, 5028, 9376, 253, 2460, 4702, 1232, 285, 5438, 273, 941, 281, 10138, 342, 436, 9376, 26332, 253, 15302, 403, 6777, 824, 326, 597, 9184, 2837, 1598, 275, 253, 6777, 285, 11120, 23115, 26312, 31506, 253, 3935, 273, 253, 19529, 812, 320, 326, 323, 247, 873, 273, 10625, 891, 476, 247, 30400, 9186, 690, 1698, 6967, 830, 800, 1232, 326, 891, 840, 476, 22059, 281, 6635, 1805, 3530, 275, 2426, 273, 26647, 2439, 253, 2720, 10625, 436, 812, 320, 8460, 2907, 432, 253, 8668, 273, 2460, 42072, 209, 571, 604, 891, 1379, 327, 253, 8668, 273, 209, 571, 2654, 452, 281, 1642, 604, 253, 11106, 2720, 1445, 1666, 25379, 285, 49602, 403, 6777, 20420, 323, 2720, 1445, 627, 651, 320, 1534, 10414, 5816, 323, 4227, 337, 285, 374, 2708, 7568, 12314, 8376, 1699, 48960, 42072, 10686, 400, 1598, 591, 13695, 2460, 1599, 285, 11041, 812, 320, 2908, 275, 841, 7274, 347, 973, 1293, 7936, 2544, 281, 253, 11333, 849, 651, 253, 3559, 1332, 7277, 275, 436, 1083, 812, 352, 320, 6508, 281, 643, 11120, 23115, 2460, 4702, 4870, 48560, 275, 13568, 5333, 3966, 50276, 284, 271, 1650, 14935, 1333, 619, 5028, 5333, 497, 275, 13568, 39501, 24088, 4942, 4228, 1113, 3169, 6568, 4632, 7060, 4481, 432, 34792, 1859, 253, 3559, 1332, 651, 417, 320, 3264, 281, 7568, 1199, 5649, 347, 253, 5028, 9376, 273, 5333, 275, 2460, 1599, 285, 11041, 403, 13588, 581, 812, 3164, 1566, 253, 275, 13568, 9381, 342, 247, 2074, 48960, 2746, 281, 3410, 5978, 533, 436, 9093, 4419, 6843, 3640, 273, 253, 5028, 10739, 275, 247, 1698, 6967, 36833, 3579, 1039, 50275, 284, 352, 310, 3559, 253, 16774, 1543, 403, 23176, 347, 253, 1332, 310, 4942, 15246, 285, 19132, 689, 253, 3559, 1666, 25379, 533, 253, 1268, 273, 7680, 310, 12744, 347, 253, 4327, 273, 1666, 25379, 3480, 247, 11080, 3410, 273, 285, 5301, 281, 1375, 273, 253, 1445, 2460, 42072, 3082, 50275, 18, 275, 856, 22868, 259, 606, 938, 1797, 2321, 4090, 4060, 2321, 4090, 48960, 5889, 273, 3632, 35919, 569, 323, 10237, 3733, 2488, 3227, 5503, 80, 259, 606, 285, 11450, 319, 27116, 1269, 22728, 285, 5139, 266, 465, 30797, 18279, 285, 1182, 73, 2821, 340, 86, 285, 3865, 66, 271, 395, 76, 22711, 285, 1182, 12109, 31524, 259, 606, 1984, 5564, 24301, 1972, 275, 11454, 1491, 5162, 2718, 8121, 66, 320, 90, 11500, 91, 7438, 285, 
340, 4204, 484, 23187, 285, 268, 632, 606, 285, 480, 259, 430, 1342, 362, 3920, 266, 807, 938, 1797, 9688, 3614, 5758, 15337, 3024, 39061, 301, 81, 22, 78, 2851, 17312, 4989, 91, 21, 50275, 19, 275, 856, 22868, 1182, 12109, 14952, 324, 735, 24406, 4060, 324, 735, 24406, 6753, 2321, 420, 2488, 89, 5104, 86, 1182, 12109, 285, 2805, 22589, 259, 606, 285, 480, 757, 1182, 12109, 285, 1182, 31035, 1182, 73, 543, 1984, 5564, 48985, 8059, 327, 4715, 14237, 807, 14952, 9688, 3614, 5758, 15337, 3024, 39061, 301, 1615, 89, 563, 656, 76, 10936, 50275, 783, 7364, 273, 253, 789, 403, 9648, 2590, 432, 253, 19529, 5474, 33032, 2520, 2929, 12661, 247, 747, 48960, 3740, 42072, 5700, 1604, 4826, 323, 5028, 26647, 275, 253, 24705, 26405, 4836, 253, 2746, 273, 1604, 4826, 476, 6635, 1892, 17521, 1025, 3888, 1309, 3733, 13538, 253, 1566, 432, 689, 31893, 327, 253, 2603, 5028, 253, 5978, 310, 6312, 3066, 4715, 48960, 3740, 4735, 4679, 327, 767, 24705, 26405, 49602, 7568, 253, 12510, 273, 1604, 4826, 50276, 296, 3755, 20556, 253, 4081, 2746, 476, 9090, 3157, 253, 3045, 273, 1666, 25379, 3045, 327, 39709, 10625, 1690, 26405, 285, 9162, 8892, 50276, 20881, 1255, 337, 253, 4081, 1604, 4826, 310, 6296, 271, 2746, 281, 9017, 253, 7286, 9991, 273, 253, 3733, 3888, 285, 436, 2746, 476, 417, 5416, 253, 3045, 327, 581, 39709, 5028, 26332, 436, 1332, 19756, 253, 4665, 1430, 285, 1604, 4826, 476, 3157, 253, 26647, 3745, 3066, 3295, 42072, 533, 625, 42072, 2538, 403, 3058, 824, 347, 15632, 323, 253, 26647, 273, 247, 1566, 432, 5222, 455, 429, 3733, 941, 281, 1698, 3243, 5175, 941, 50276, 19, 2139, 417, 4647, 253, 1604, 4826, 327, 256, 5503, 9162, 1666, 25379, 751, 479, 2960, 275, 2829, 721, 285, 298, 19, 69, 275, 2829, 818, 891, 4282, 1880, 253, 3045, 7756, 310, 4122, 2905, 342, 253, 8245, 3045, 50276, 20, 253, 34385, 2429, 342, 1655, 256, 5503, 42072, 1332, 310, 417, 4755, 751, 253, 5878, 4826, 2361, 275, 2829, 374, 273, 915, 625, 5301, 310, 3058, 285, 824, 14023, 943, 320, 4845, 275, 253, 2022, 2929, 417, 915, 50273, 74, 5194, 342, 4477, 4767, 7364, 275, 253, 915, 534, 310, 253, 3733, 673, 253, 2488, 476, 1908, 849, 281, 4796, 253, 3733, 2105, 50276, 187, 187, 4118, 18435, 27, 19583, 285, 8542, 1039, 281, 513, 1805, 387, 5028, 26647, 672, 352, 3249, 281, 24705, 26405, 1604, 4826, 476, 6635, 3888, 326, 403, 1892, 1309, 3733, 285, 3657, 253, 1566, 432, 689, 31893, 327, 253, 2603, 5028, 1677, 326, 352, 2987, 973, 310, 4942, 2969, 281, 3359, 285, 4473, 1230, 3590, 891, 1158, 352, 588, 4549, 281, 247, 1781, 5110, 273, 253, 5723, 2824, 8446, 326, 2987, 327, 5028, 26647 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary the paper is clear and very well written it makes important steps towards understanding if sparse convolutional neural networks can represent a substitute for their dense counterparts moreover it unveils the relation between the performance of sparse neural networks and gradient flow based on this relation it explains also why the dynamic sparse training approach has higher potential of improving sparse neural networks in the future while lottery tickets are limited by the performance of the pruning solutions from which they are derived the last but not the least the paper introduces a simple and practical method specially designed to initialise sparse networks weights strong points the paper brings novel basic knowledge and understanding of sparse neural networks the extensive set of experiments is welldesigned very informative and support the paper claims the fundamental study performed in this paper is timely and has the potential of advancing seriously the field weak points while the abstract and some other parts of the paper discuss about deep neural networks in general the experiments are solely focused on convolutional neural networks i believe that extending them also to other types of networks would improve the overall quality of the paper for the discussion phase i suggest to the authors to consider the weak point and the following minor comments 1 i find very interesting that any sparse training method cope much better with the resnet architecture than with the vgg architecture it is easy to observe this in table 1 do you have any idea why is this happening is this the effect of skip connections 2 can you add in figure 3 the version of the training algorithms for resnet50 and the version without for lenet5 3 the relation between hessian gradient flow and sparse training raised my curiosity but i agree with the authors that this investigation can be let for future work docsepthis paper presents three key hypotheses for sparse nn training dynamics and provides empirical studies and observations to verify them this review will first provide general comments and then specific ones on each hypothesis main comments pros three messages that the authors try to convey are important and interesting to the audiences in pruning it is an observational paper which provides insights on 1 what is a good initialization of the for sparse nn training 2 why dst can achieve good generalization and 3 is lth really different from pruning cons the presentation of the paper needs work the highlevel structure is good and clear but for each paragraph the logic flow is hard to follow for example in a very key paragraph on p7 lottery tickets learn similar functions to the pruning solution i have to read repeatedly and infer inner logics of each sentence to see the conclusion hypotheses i appreciate identifying the problem of naive initialization of sparse nn and connecting it with gradient flow however a new proposal for initialization here as the major contribution is unnecessary and actually negatively affects the credits of the true contribution the results presented in table 1 are not very impressive it is ok to just compare original and liu et al and provide insights in an observational paper the observations provide one possible explanation on why dst might work the authors could try to test this in different architectures even beyond cnns to see if they are widely held if so it is 
potentially a good metric or analysis tool for sparse nn training i am not fully convinced by the third hypothesis first the models and datasets for ensemble and prediction disagreement are too limited while the conclusion is very strong also i think a more appropriate statement could be lottery tickets learn similar functions to the pruning solution than random scratch since that is the only thing you are comparing with minor comment it would be interesting to see if all the conclusions hold in other models besides cnnsdocsep paper summary this paper presents an empirical study of sparse deep nets either obtained by sparsification methods such as dynamic sparse training or by pruning according to the lottery ticket hypothesis the main contribution of this work is to study gradient flow both at initialisation and during training and to propose an extension of known initialisation methods that works for sparse networks in addition this work also attempts at explaining why lottery tickets are successful despite sharing similar problems related to the gradient flow when compared to other sparsification methods reasons for score  overall i like this kind of empirical study where authors set the stage for important questions and attempt at answering them with a thorough empirical study my major concern for accepting this work relates to the depth of contributions in my humble opinion the proposed generalisation of hes initialisation could have have been the main only focus of this work with additional experiments and considerations relation to other sparsity inducing methods not only dst a better understanding of the interaction between initialisation batch normalisation and skip connections instead the presentation strategy in this paper is to illustrate several findings but due to space constraints in a more shallow manner this choice dilutes the contributions too much positive points  1 empirical work that addresses an important topic that of sparse nn questions are well motivated and sufficiently well described although they have the slightly negative effect of diluting the overall take home message from reading this paper 2 the proposed extension to a known initialisation method to cope with issues related to gradient flow during the early stages of training is reasonable and effective as shown by the experiments 3 the experiments on gradient flow during training and the ones on lottery tickets confirm either known results or intuition they can be viewed as a reproducibility study which is commendable some by products of the study indicate important properties of lt which are of direct practical relevance eg rewinding strategies negative points 1 the proposed generalisation of hes method is not sufficiently exploited focusing on the forward pass the idea is to initialise weights on a per neuron basis using the mask computed by the sparsification method as the authors notice the work by lui et al achieves similar performance fig 1c and in many cases it outperforms the proposed method tab1 bold results this calls for a better understanding of the advantages of the proposed method furthermore the interaction of sparse initialisation with batch normalisation and skip connections is not sufficiently studied in fig2 all methods appear similar finally the fact that a small but dense network achieves better results lenet on mnist tab1 and does not suffer from gradient flow problems fig2 c is interesting and calls again for further study 2 the results on gradient flow during training are only 
superficially commented although the authors hint at additional ideas based on second order approximations of the loss ie considering the hessian and its eigenspectrum overall the take home message from fig3 confirms the known behaviour of dst methods such as rigl albeit interesting in my humble opinion this results seem to be given more real estate on the paper than it deserves subtracting space for the main contribution on a new initialisation scheme 3 the results on lottery ticket could enjoy some improvements on the terminology which is a minor remark indeed it would be easier to refer to 1 imp solution pruning solution 2 random initfinal startend 3 lt initfinal startend my main concern with this set of results is that on the one hand they are somehow expected especially with respect to the large literature available on the topic it is not bad per se to collect in one coherent piece of work previous observations and place them in a thorough experimental framework however i have problems with the following a the notion of closeness as shown in fig 5 ad and reported in fig 5 be is the result of a dramatic dimensionality reduction similar techniques have been used in other contexts eg hao li et al visualizing the loss landscape of neural nets nips 2018 and the warnings are to take results with a grain of salt that said it is expected by construction to find that lt final networks are close to the imp solution b the argument used to confirm that lt are in the basin of the imp solution is based on path connectivity and as the author also note in a foot note in page 7 studying in detail this path is outside the scope of the paper the geometry of the loss landscapes is in general very complex especially when there is no batch normalisation nor skip connections which have the effect of smoothing it i am not sure it is correct to claim that if two solutions that is the params of a neural net have the same loss and they are connected by a linear path then it is necessary true that they lie in the same basin even if results in tab2 on the disagreement are compelling it might still not be necessarily true that the two compared models are the same instance of function approximation c as a minor remark the implications of the results in sec 43 are interesting and valuable but i failed to understand properly the connection to the empirical study on the distance between imp solutions and lt final solutions   additional comments i found this paper well written in most parts just a minor comment on the terminology used in one subsection i liked this work and i think it has plenty of potential as an humble suggestion would it make sense to attempt at focussing more the message and insist on the main contribution of the paper as per a new initialisation scheme or was this not seen as sufficient in light of the results from liu et al 2019docsepoverview summary this paper tries to answer the following two questions i why training unstructured sparse networks from random initiation perform poorly 2 what makes lts and dst the exception the authors show the following findings 1 sparse nns have poor gradient flow at initialization they show that existing methods for initializing sparse nns are incorrect in not considering heterogeneous connectivity improved methods are sample initialization from a dynamic gaussian whose variance is related to the fanin numbers fanin fanout rule plays an important role here and improves the gradient flow 2 sparse nns have poor gradient flow during training they show that dst 
based methods achieving the best generalization have improved gradient flow 3 they find the lts do not improve gradient flow rather their success lies in relearning the pruning solution they are derived from strength bullets 1 the idea is very interesting i appreciate the novel analysis the proposed methods are wellmotivated 2 the paper is well written and easy to understand 3 the finding is surprising but the experiment design is poor which i will list more detailed limitations in the weakness sections i like the idea i will raise my score if the authors can completely address my confusion and concerns weakness bullets 1 for table 1 a strong baseline is missing why not compare with the performance of the lottery ticket setting i think it is a more natural baseline than set and rigl 2 in my opinion there is a mustdo experiment lottery ticket mask proposed initialization and compare it to lt and random tickets because the lt mask random reinitialization random tickets fail in the previous literature according to the explanation in the paper it can also be the problem of random reinitialization thus strong supportive evidence is that show proposed modified random reinitialization lt mask can surpass random ticket performance 3 missing details what is the pruning ratio of each stage in iterative magnitude pruning the appendix only tells me the author using 95 and 80 sparsity why pick these two sparsity because this sparsity gives the extreme matching subnetworks and the author uses iterative magnitude pruning if they follow the original lth setting pruning 20 for each time then the sparsity should be 108i how to achieve 95 and 80 4 what is the definition of pruning solution is it the obtained mask or initialization or subnetworks contains both mask and initialization super confused 5 conflicted experiments results with linear mode connectivity and the lottery ticket hypothesis paper resnet 50 imp lt on imagenet without early weight rewinding can not have good linear mode connectivity however the pruning solution and lt solution have good linear mode connectivity it is wired even for two lts resnet 50 imp lt on imagenet trained with the same initialization in different data orders they do not have a linear path where interpolated training loss is flat as evidenced in figure 5 in the paper linear mode connectivity and the lottery ticket hypothesis early weight rewinding is needed for the presented results while i think the author did not use it 6 the comparison in table 2 is unfair scratch settings are trained from five different random initialization while lt settings are trained from the same initialization with different data orders lt setting results should also be from different initialization otherwise can not achieve the conclusion that lottery tickets learn similar functions to the pruning solution minor 1 the definition of lth in 33 perform as well as onfthetam why there is m it should be the full dense model without the mask right post rebuttal thanks to the authors for the extra experiments and feedback lottery baseline for table1 although rigl does not need dense network training it cost more to find the mask table 2 of the rigl paper random tickets random ticket lt mask random reinitialization rather than random pruning random init the front one will be much more interesting because the lt mask random reinitialization random tickets fail in the previous literature according to the explanation in the paper it can also be the problem of random reinitialization thus strong supportive 
evidence is that show proposed modified random reinitialization lt mask can surpass random ticket performance i personally do the experiment that performing proposed initialization on random tickets and the performance is unchanged of course there may exist lots of reasons for the results i will not degrade the paper according to my experiments other concerns are willaddressed thanks although i do like the idea of this paper i think it might need to be revised and resubmitted incorporating the extensive discussion presented by all the reviewers i tend to keep my scores unchanged but i dont think this is 100 a clear reject and depending on the opinions of the other reviewers i would not feel that accepting this paper was completely out of bounds ### Summary:
the paper shows empirically that training unstructured sparse networks from random initialization performs poorly as sparse nns have poor gradient flow at initialization besides this the authors argue that sparse nns have poor gradient flow during training they show that dst based methods achieving the best generalization have improved gradient flow moreover they find that lts do not improve gradient flow rather their success lies in relearning the pruning solution they are derived from i read the paper and the reviewers discussed the rebuttal although all the reviewers found the rebuttal helpful and they all agree that the paper is decently well written and has some clear value the majority believes that further observations are required for making the paper and its hypothesis convincing there is also some recent related work on initialization of pruned networks eg by rescaling their weights at initialization i believe adding a discussion of such related techniques and making the connection to existing work will greatly strengthen the paper and provide more evidence to support its claims
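the summary above refers to an initialization that accounts for each unit's actual connectivity under the sparse mask rather than the dense layer's nominal fan-in. a minimal sketch of one such fan-in-aware he-style rule is shown below; it is only an illustration of the idea discussed in the reviews, the function name and the relu gain of 2.0 are assumptions, and the paper's exact scaling may differ.

```python
# illustrative sketch only: a fan-in-aware he/kaiming initialization for a
# pruned layer, following the idea described in the reviews; the function
# name and the relu gain of 2.0 are assumptions, not the paper's exact rule.
import torch


def sparse_kaiming_init_(weight, mask):
    # weight and mask share a shape: (out_features, in_features) for a linear
    # layer or (out_channels, in_channels, kh, kw) for a conv layer; mask is 0/1
    flat_mask = mask.reshape(mask.shape[0], -1)
    fan_in = flat_mask.sum(dim=1).float().clamp(min=1.0)  # per-unit fan-in under the mask
    std = torch.sqrt(2.0 / fan_in)                         # he-style std, one value per output unit
    shape = (-1,) + (1,) * (weight.dim() - 1)
    with torch.no_grad():
        weight.normal_(mean=0.0, std=1.0)
        weight.mul_(std.view(shape))            # rescale each unit by its own sparse fan-in
        weight.mul_(mask.to(weight.dtype))      # zero out the pruned connections
    return weight


# example: a roughly 90%-sparse linear layer
layer = torch.nn.Linear(512, 256)
mask = (torch.rand(256, 512) > 0.9).float()
sparse_kaiming_init_(layer.weight, mask)
```

applied layer by layer after the mask has been chosen, this reduces to standard he initialization for a dense layer, since every unit then sees the full fan-in.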
[ 619, 2201, 4468, 323, 18738, 436, 789, 7033, 281, 253, 6864, 273, 9021, 275, 619, 26896, 4743, 253, 4081, 2087, 5837, 273, 34236, 3302, 5837, 812, 452, 452, 644, 253, 2022, 760, 2770, 273, 436, 789, 342, 3081, 4679, 285, 15711, 5886, 281, 643, 37139, 414, 24635, 3082, 417, 760, 24334, 247, 1805, 4685, 273, 253, 5016, 875, 3302, 5837, 14604, 2622, 5837, 285, 17049, 10291, 50276, 34235, 253, 9759, 5700, 275, 436, 2929, 310, 281, 17093, 2067, 4342, 533, 1955, 281, 2317, 10806, 275, 247, 625, 20126, 5133, 436, 4327, 7425, 2279, 253, 9021, 1512, 1199, 50273, 10247, 2792, 575, 50276, 18, 16774, 789, 326, 12453, 271, 1774, 9400, 326, 273, 23507, 48257, 3533, 403, 973, 17194, 285, 10481, 973, 2529, 3738, 597, 452, 253, 5777, 4016, 1055, 273, 7425, 9634, 253, 4583, 1379, 1728, 3935, 432, 4361, 436, 2929, 50276, 19, 253, 4081, 6880, 281, 247, 1929, 3302, 5837, 1332, 281, 23808, 342, 3374, 2905, 281, 11786, 2685, 1309, 253, 2393, 8661, 273, 3733, 310, 5272, 285, 3576, 347, 2011, 407, 253, 4679, 50276, 20, 253, 4679, 327, 11786, 2685, 1309, 3733, 285, 253, 4394, 327, 36284, 14997, 6583, 2057, 1929, 1543, 390, 30328, 597, 476, 320, 11575, 347, 247, 38041, 1263, 534, 310, 49638, 494, 690, 407, 3580, 273, 253, 1263, 5224, 1774, 3607, 273, 46007, 534, 403, 273, 1480, 8542, 17200, 24088, 294, 88, 3087, 8130, 50273, 12373, 2792, 50276, 18, 253, 4081, 2087, 5837, 273, 34236, 1332, 310, 417, 10481, 28734, 13654, 327, 253, 3579, 1509, 253, 2934, 310, 281, 3302, 885, 13461, 327, 247, 591, 23586, 3720, 970, 253, 8989, 10302, 407, 253, 37139, 1877, 1332, 347, 253, 4477, 4366, 253, 789, 407, 17669, 1162, 355, 33526, 2074, 3045, 3036, 337, 68, 285, 275, 1142, 2219, 352, 41731, 13015, 253, 4081, 1332, 10334, 18, 13433, 1543, 436, 5841, 323, 247, 1805, 4685, 273, 253, 11361, 273, 253, 4081, 1332, 33810, 253, 5016, 273, 23507, 3302, 5837, 342, 14604, 2622, 5837, 285, 17049, 10291, 310, 417, 10481, 5421, 275, 3036, 19, 512, 3082, 3176, 2074, 4720, 253, 958, 326, 247, 1355, 533, 14086, 2990, 33526, 1805, 1543, 8472, 292, 327, 278, 79, 382, 10334, 18, 285, 1057, 417, 11089, 432, 11786, 2685, 3237, 3036, 19, 260, 310, 4722, 285, 5841, 969, 323, 2007, 1263, 50276, 19, 253, 1543, 327, 11786, 2685, 1309, 3733, 403, 760, 23426, 280, 1365, 20503, 3738, 253, 4477, 12662, 387, 3081, 5697, 1754, 327, 1273, 1340, 34754, 273, 253, 2957, 26332, 7296, 253, 344, 859, 757, 285, 697, 299, 17731, 808, 4638, 4583, 253, 1379, 1728, 3935, 432, 3036, 20, 23849, 253, 1929, 8770, 273, 24334, 3082, 824, 347, 8132, 77, 23447, 4722, 275, 619, 26896, 4743, 436, 1543, 1646, 281, 320, 1677, 625, 1524, 8304, 327, 253, 2929, 685, 352, 22828, 45771, 2317, 323, 253, 2022, 7680, 327, 247, 747, 3302, 5837, 6974, 50276, 20, 253, 1543, 327, 36284, 13571, 812, 4264, 690, 11701, 327, 253, 28939, 534, 310, 247, 5884, 7579, 6296, 352, 651, 320, 6927, 281, 3730, 281, 337, 1607, 2900, 50276, 1087, 25004, 2900, 374, 3632, 2012, 13017, 50276, 5478, 423, 495, 46007, 2012, 13017, 50276, 5478, 423, 619, 2022, 4468, 342, 436, 873, 273, 1543, 310, 326, 327, 253, 581, 1133, 597, 403, 10380, 3264, 3340, 342, 1675, 281, 253, 1781, 6239, 2130, 327, 253, 9400, 352, 310, 417, 3076, 591, 396, 281, 4822, 275, 581, 18893, 5313, 273, 789, 2045, 7313, 285, 1659, 731, 275, 247, 11080, 5661, 7792, 2299, 891, 452, 3237, 342, 253, 1563, 247, 253, 10732, 273, 2734, 8098, 347, 2011, 275, 3036, 608, 519, 285, 2361, 275, 3036, 608, 320, 310, 253, 906, 273, 247, 14138, 7877, 1319, 5141, 2074, 5609, 452, 644, 908, 275, 643, 22349, 24088, 419, 80, 632, 1162, 355, 5304, 3006, 253, 2957, 
13016, 273, 11454, 37507, 295, 2824, 4765, 285, 253, 20942, 403, 281, 1379, 1543, 342, 247, 13723, 273, 7043, 326, 753, 352, 310, 3264, 50276, 1615, 5140, 50276, 936, 1089, 326, 46007, 2457, 6928, 403, 2810, 281, 253, 1607, 2900, 270, 253, 4154, 908, 281, 6583, 326, 46007, 403, 275, 253, 31567, 273, 253, 1607, 2900, 310, 1754, 327, 1854, 17769, 285, 347, 253, 2488, 671, 3877, 275, 247, 3174, 3877, 275, 3239, 818, 12392, 275, 2508, 436, 1854, 310, 3345, 253, 7990, 273, 253, 2929, 253, 12087, 273, 253, 2957, 37328, 310, 275, 2087, 1077, 2570, 3340, 672, 627, 310, 642, 14604, 2622, 5837, 4543, 17049, 10291, 534, 452, 253, 1055, 273, 36971, 352, 891, 717, 417, 2119, 352, 310, 3451, 281, 1750, 326, 604, 767, 5482, 326, 310, 253, 18912, 273, 247, 11454, 2036, 452, 253, 1072, 2957, 285, 597, 403, 4802, 407, 247, 4872, 1854, 840, 352, 310, 3309, 2032, 326, 597, 7027, 275, 253, 1072, 31567, 1014, 604, 1543, 275, 10334, 19, 327, 253, 30859, 403, 18511, 352, 1537, 1335, 417, 320, 7933, 2032, 326, 253, 767, 2429, 3210, 403, 253, 1072, 4227, 273, 1159, 11193, 260, 347, 247, 5884, 7579, 253, 12739, 273, 253, 1543, 275, 4706, 7652, 403, 4722, 285, 9865, 533, 891, 4242, 281, 2096, 6283, 253, 4602, 281, 253, 16774, 1263, 327, 253, 4181, 875, 1607, 5482, 285, 46007, 2457, 5482, 17345, 50274, 38092, 5701, 50276, 74, 1119, 436, 2929, 973, 3542, 275, 954, 4243, 816, 247, 5884, 4385, 327, 253, 28939, 908, 275, 581, 19087, 891, 10490, 436, 789, 285, 891, 1158, 352, 556, 9828, 273, 2442, 50276, 284, 271, 26896, 14876, 651, 352, 1056, 3282, 281, 3177, 387, 41685, 1316, 272, 625, 253, 3935, 285, 23103, 327, 253, 2022, 7680, 273, 253, 2929, 347, 591, 247, 747, 3302, 5837, 6974, 390, 369, 436, 417, 2326, 347, 4209, 275, 1708, 273, 253, 1543, 432, 632, 86, 1162, 355, 6247, 7152, 33032, 39930, 50276, 8774, 436, 2929, 14177, 281, 3662, 253, 1563, 767, 3533, 891, 2139, 3733, 440, 34218, 23507, 6928, 432, 3632, 17000, 1347, 15225, 374, 752, 2789, 298, 1641, 285, 24334, 253, 6517, 253, 4477, 921, 253, 1563, 4342, 337, 23507, 295, 2224, 452, 4105, 11786, 2685, 387, 31850, 597, 921, 326, 5368, 3082, 323, 3302, 3006, 23507, 295, 2224, 403, 13583, 275, 417, 7296, 22766, 17769, 5520, 3082, 403, 3410, 31850, 432, 247, 7870, 305, 12064, 3692, 11041, 310, 2905, 281, 253, 7989, 249, 3904, 7989, 249, 50276, 20227, 483, 4086, 7120, 271, 1774, 2554, 1060, 285, 19132, 253, 11786, 2685, 374, 23507, 295, 2224, 452, 4105, 11786, 2685, 1309, 3733, 597, 921, 326, 24334, 1754, 3082, 17170, 253, 1682, 26647, 452, 5520, 11786, 2685, 495, 597, 1089, 253, 298, 1641, 513, 417, 3157, 11786, 2685, 2581, 616, 2323, 8696, 275, 1693, 4026, 253, 819, 25004, 2900, 597, 403, 6012, 432, 50275, 45563, 29093, 337, 253, 2934, 310, 1077, 4722, 891, 11435, 253, 4460, 1783, 253, 4081, 3082, 403, 973, 24013, 8550, 374, 253, 2929, 310, 973, 3542, 285, 3477, 281, 2096, 495, 253, 4560, 310, 10084, 533, 253, 3368, 2216, 310, 4105, 534, 891, 588, 1618, 625, 7000, 7364, 275, 253, 14855, 7118, 891, 751, 253, 2934, 891, 588, 7164, 619, 4868, 604, 253, 4477, 476, 4336, 2953, 619, 13775, 285, 7350, 50275, 20881, 1255, 29093, 337, 323, 2829, 337, 247, 2266, 8245, 310, 5816, 2139, 417, 7277, 342, 253, 3045, 273, 253, 36284, 13571, 4758, 891, 1158, 352, 310, 247, 625, 3626, 8245, 685, 873, 285, 8132, 77, 374, 275, 619, 4743, 627, 310, 247, 1364, 3088, 3368, 36284, 13571, 8989, 50276, 856, 7334, 31850, 285, 7277, 352, 281, 46007, 285, 3632, 14997, 984, 253, 46007, 8989, 50276, 14719, 294, 19078, 1320, 50276, 14719, 14997, 1891, 275, 253, 2045, 6239, 2556, 281, 253, 8813, 275, 
253, 2929, 352, 476, 671, 320, 253, 1895, 273, 3632, 294, 19078, 1320, 3021, 2266, 23384, 1941, 310, 326, 921, 4081, 7321, 3632, 294, 19078, 1320, 50276, 5792, 8989, 476, 28842, 3632, 13571, 3045, 495, 5816, 4278, 752, 310, 253, 819, 25004, 4313, 273, 1016, 3924, 275, 34560, 9777, 819, 25004, 253, 30762, 760, 8599, 479, 253, 2488, 970, 5325, 285, 5096, 37139, 414, 2139, 2619, 841, 767, 37139, 414, 984, 436, 37139, 414, 4245, 253, 9559, 11038, 749, 3024, 4896, 285, 253, 2488, 4648, 34560, 9777, 819, 25004, 604, 597, 956, 253, 3236, 298, 394, 4758, 819, 25004, 1384, 323, 1016, 673, 840, 253, 37139, 414, 943, 320, 13278, 74, 849, 281, 5115, 5325, 285, 5096, 577, 752, 310, 253, 5426, 273, 819, 25004, 2900, 310, 352, 253, 2797, 8989, 390, 31850, 390, 749, 3024, 4896, 4428, 1097, 8989, 285, 31850, 2221, 13477, 608, 7344, 264, 4679, 1543, 342, 4872, 4438, 17769, 285, 253, 36284, 13571, 9079, 2929, 501, 3024, 2456, 1607, 46007, 327, 4440, 257, 292, 1293, 2393, 2801, 294, 88, 3087, 476, 417, 452, 1175, 4872, 4438, 17769, 2299, 253, 819, 25004, 2900, 285, 46007, 2900, 452, 1175, 4872, 4438, 17769, 352, 310, 36427, 1014, 323, 767, 298, 1641, 501, 3024, 2456, 1607, 46007, 327, 4440, 257, 292, 10166, 342, 253, 1072, 31850, 275, 1027, 941, 7367, 597, 513, 417, 452, 247, 4872, 1854, 835, 20670, 456, 3733, 2957, 310, 6507, 347, 27007, 275, 4677, 608, 275, 253, 2929, 4872, 4438, 17769, 285, 253, 36284, 13571, 9079, 2393, 2801, 294, 88, 3087, 310, 3058, 323, 253, 3559, 1543, 1223, 891, 1158, 253, 2488, 858, 417, 897, 352, 50276, 23, 253, 5301, 275, 2829, 374, 310, 16593, 20041, 7533, 403, 10166, 432, 2620, 1027, 3632, 31850, 1223, 46007, 7533, 403, 10166, 432, 253, 1072, 31850, 342, 1027, 941, 7367, 46007, 4758, 1543, 943, 671, 320, 432, 1027, 31850, 5010, 476, 417, 5115, 253, 6452, 326, 36284, 14997, 3037, 2074, 3470, 281, 253, 819, 25004, 2900, 50276, 37585, 337, 253, 5426, 273, 298, 394, 275, 5922, 1347, 347, 973, 347, 327, 649, 6168, 312, 2139, 627, 310, 278, 352, 943, 320, 253, 2120, 14086, 1566, 1293, 253, 8989, 987, 50275, 5996, 30080, 22559, 50276, 35501, 281, 253, 4477, 323, 253, 4465, 4679, 285, 8680, 50276, 11753, 18631, 8245, 323, 2829, 18, 3738, 8132, 77, 1057, 417, 878, 14086, 2990, 3733, 352, 2105, 625, 281, 1089, 253, 8989, 2829, 374, 273, 253, 8132, 77, 2929, 50276, 14719, 14997, 3632, 13571, 50276, 5792, 8989, 50276, 14719, 294, 19078, 1320, 2581, 685, 3632, 819, 25004, 50276, 14719, 2012, 253, 2914, 581, 588, 320, 1199, 625, 4722, 984, 253, 46007, 8989, 50276, 14719, 294, 19078, 1320, 50276, 14719, 14997, 1891, 275, 253, 2045, 6239, 2556, 281, 253, 8813, 275, 253, 2929, 352, 476, 671, 320, 253, 1895, 273, 3632, 294, 19078, 1320, 3021, 2266, 23384, 1941, 310, 326, 921, 4081, 7321, 3632, 294, 19078, 1320, 50276, 5792, 8989, 476, 28842, 3632, 13571, 3045, 891, 11697, 513, 253, 3368, 326, 9591, 4081, 31850, 327, 3632, 14997, 285, 253, 3045, 310, 19965, 273, 2282, 627, 778, 2226, 8783, 273, 4606, 323, 253, 1543, 891, 588, 417, 40195, 253, 2929, 2556, 281, 619, 4679, 50276, 977, 7350, 403, 588, 1911, 2079, 6701, 50276, 20261, 891, 513, 751, 253, 2934, 273, 436, 2929, 891, 1158, 352, 1537, 878, 281, 320, 17265, 285, 501, 538, 3004, 24049, 253, 9470, 5955, 3559, 407, 512, 253, 30628, 891, 5257, 281, 1978, 619, 7363, 19965, 533, 891, 13414, 1158, 436, 310, 2233, 247, 2590, 12009, 285, 7293, 327, 253, 11626, 273, 253, 643, 30628, 891, 651, 417, 1928, 326, 18738, 436, 2929, 369, 4336, 562, 273, 14493, 2490, 187, 4118, 18435, 27, 783, 2929, 2722, 45190, 326, 3733, 440, 34218, 23507, 6928, 432, 3632, 
31850, 17923, 15225, 347, 23507, 295, 2224, 452, 4105, 11786, 2685, 387, 31850, 16280, 253, 4477, 9059, 326, 23507, 295, 2224, 452, 4105, 11786, 2685, 1309, 3733, 597, 921, 326, 24334, 1754, 3082, 17170, 253, 1682, 26647, 452, 5520, 11786, 2685, 25761, 597, 1089, 253, 298, 1641, 513, 417, 3157, 11786, 2685, 2581, 616, 2323, 8696, 275, 1693, 4026, 253, 819, 25004, 2900, 597, 403, 6012, 432, 891, 1239, 253, 2929, 285, 253, 30628, 5469, 253, 30080, 22559, 3738, 512, 253, 30628, 1119, 253, 30080, 22559, 9371, 285, 597, 512, 5194, 326, 253, 2929, 310, 1086, 1574, 973, 3542, 285, 556, 690, 2590, 1318, 253, 5020, 11532, 326, 2007, 7313, 403, 2424, 323, 2403, 253, 2929, 285, 697, 9079, 21414, 627, 403, 671, 690, 3332, 2905, 789, 327, 31850, 273, 819, 37437, 6928, 24088, 407, 46595, 272, 616, 13461, 387, 31850, 891, 2868, 6240, 253, 5955, 273, 824, 2905, 5609, 285, 2403, 253, 4602, 281, 5368, 789, 588, 10260, 17084, 253, 2929, 285, 3400, 625, 1941, 281, 1329, 697, 3916, 50275 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review: This paper proposes a transformer-like model, the graph multiset transformer (GMT), to perform graph pooling/aggregation. Overall, the technical part is concrete and clear, and the experimental evaluations are comprehensive. The authors also prove the expressive power with respect to the WL test. However, several points need to be clarified or addressed.

1. The idea of utilizing a transformer-like architecture to model the graph neural network (GNN) is not new; some existing works [1, 2] have employed the transformer to enhance the expressive power of GNNs. It would be better to add more discussion between GMT and the existing works to highlight its contribution. Meanwhile, several studies [3, 4, 5, 6, 7] about graph pooling / self-attention are missing; it would be better to discuss them in the related work. Moreover, if possible, I suggest making a comparison with these methods, especially the recent studies (e.g., HaarPool, EigenPool, etc.), to make the whole experimental results more convincing.
2. About the experimental settings:
   1. In section 4.1, the authors state that the 4 molecule datasets are obtained from the OGB dataset. However, the OGB dataset only contains HIV, while Tox21, ToxCast, and BBBP are not included; maybe there is a mistake.
   2. For the molecular datasets, the data splitting is very crucial for the final results. Meanwhile, the atom/bond feature extraction process for the molecular datasets is unclear. The authors need to clarify the data splitting (random/scaffold) and the feature extraction process to ensure the reproducibility of the experiments.

Minor:
- In equation 6, what is the qwiq?
- In equation 8, why do we need mh(h, h, h)? How about directly applying h into the self-att block, i.e., z = h?
- In the experiments, this paper only evaluates the memory efficiency of GMT; I would like to see an evaluation of the time efficiency of GMT against other baselines.

Overall, this paper is well written and the experiments look solid. Considering the novelty issue, I think this is a borderline paper. I recommend marginally below the acceptance threshold and would like to see the authors' response.

[1] Rong, Yu, et al. GROVER: self-supervised message passing transformer on large-scale molecular data. arXiv preprint arXiv:2007.02835, 2020.
[2] Chithrananda, Seyone, Gabe Grand, and Bharath Ramsundar. ChemBERTa: large-scale self-supervised pretraining for molecular property prediction. arXiv preprint arXiv:2010.09885, 2020.
[3] Wang, Yu Guang, et al. Haar graph pooling. arXiv 2019, arXiv:1909.
[4] Ma, Yao, et al. Graph convolutional networks with EigenPooling. Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2019.
[5] Bianchi, Filippo Maria, Daniele Grattarola, and Cesare Alippi. Spectral clustering with graph neural networks for graph pooling. 2020.
[6] Ranjan, Ekagra, Soumya Sanyal, and Partha P. Talukdar. ASAP: adaptive structure aware pooling for learning hierarchical graph representations. AAAI 2020.
[7] Li, Jia, et al. Semi-supervised graph classification: a hierarchical graph perspective. The World Wide Web Conference, 2019.

docsep
This work proposes a graph multiset transformer (GMT) that uses a multi-head attention-based approach to capture potential interactions between nodes when pooling nodes to produce a graph representation. A multi-head attention mechanism is used to group nodes into clusters, each of which produces a representation; self-attention is then used to pool the representations of clusters into the representation of a graph.

Pros:
- The proposed pooling is more reasonable than a simple sum or average pooling, as the multi-head attention mechanism can potentially capture dependencies between nodes.

Cons: several parts of the manuscript need more explanation, and additional experimental results (see below for details) are needed.
1. It is not clear what multi-head attention achieves semantically, as claimed (better capture of structure information).
2. Figures are neither self-explained nor well explained in the main text.
3. The std of cross-validation in each experiment should be reported. The mean performance alone is not enough to say whether a method performs better or not; it would be better to provide a p-value (e.g., t-test) to show whether a method is statistically significantly better.
4. Four datasets from the Open Graph Benchmark were used. The authors should refer to the leaderboard for the performance of some baseline methods; for example, the leaderboard of the HIV dataset reports that GIN achieves 0.7654 rather than the 0.7595 listed in this manuscript.
5. The abstract points out that not considering task relevance is a weakness of previous pooling methods; however, it is not clear how the proposed approach improves on this.
6. The proposed method does not necessarily pass graph isomorphism, as the nodes in the manuscript have attributes but the proof does not consider node attributes.
7. In the appendix, the examples in figure 10 are confusing; more explanation is needed.

docsep
This work studies the graph pooling operation for graph neural networks. It proposes graph multiset pooling, which treats graph pooling as a multiset encoding problem and can capture the graph structural information. It first employs multi-head attention to learn node features, where the query q is a learnable matrix containing k vectors; then a GMPool operation is performed, and finally self-attention is used for learning inter-node relationships. Experimental results show the effectiveness of the proposed method.

Strengths:
- This work studies an important problem, graph pooling. Graph pooling can learn high-level graph representations but is still under-explored.
- The proposed method is interesting: by using a learnable query matrix, the method can reduce the n-node input to a k-node output. The self-attention used after GMPool can learn the relationships between high-level embeddings.
- The experimental results are promising; the proposed method outperforms the other compared methods.

Weaknesses:
- Even though the method is called multiset pooling, the method is not related to multisets. The proposed method is mainly based on the attention and self-attention mechanism, so claiming the proposed method as multiset pooling is not convincing.
- The experimental settings are not fair enough. The pooling operation is defined as reducing an n-node input to a k-node output. For all other methods the pooling layer is connected with a global sum/average; however, in the proposed GMT the GMPool is connected with a self-attention layer. It is not clear whether the proposed GMPool or the self-attention layer leads to the performance gain; a careful ablation study is needed.
- I think the proposed method can be regarded as using SAGPool within clustering-based pooling; the main difference is the multi-head attention and the learnable matrix S. Please comment if I missed something.
- The use of graph structures is not very convincing. The GNN(h, a) is the simple message passing of GNNs, so the graph structural information a is already incorporated in h, since h is obtained by GNNs.
- Several baselines are missing, such as StructPool and edge pooling; they should be discussed and compared.

Questions:
1. If the query q is obtained from the learnable matrix S, which has k×d dimensions, then the output of att(q, k, v) should also have k×d dimensions, which means the output is already reduced from n vectors to k vectors. Why do we still need the GMPool operation? The GMPool does not compress the n nodes into the k typical nodes.

Update after rebuttal: I have read the authors' rebuttal. Most of my concerns are addressed properly, and hence I am willing to increase my score from 4 to 6.

docsep
The work extends the set transformer to obtain a method for multi-head attention pooling on multisets with connectivity graphs. The authors show that the approach is as expressive as the WL isomorphism test and has better space complexity than existing node clustering networks. The method achieves state-of-the-art results on graph classification and strong results on graph reconstruction and generation.

Strengths:
- The paper is very well written and polished.
- The figures complement the text well.
- The work is technically and mathematically sound.
- The method shows good results and is scalable, making it a valuable addition to the set of existing GNN operators.
- The authors make an effort to substantiate their statements about expressivity and scalability with proofs.
- The experiments are well chosen and show where improvements come from.

Weaknesses:
- The proven expressiveness is not a very strong statement, since most pooling approaches adhere to this property; it is nice to have the theoretical analysis, though.
- The method itself is an incremental variation of set transformers (Lee et al.), although adapted for a different type of input data.
- Using the identity matrix as adjacency, as described in appendix B, to work around the scalability issue of node clustering methods seems to make the approach identical to the set transformer in all layers except the first, which dampens the contribution.
- There is some potential for improvement in clarity of presentation:
  - In the abstract, "may yield representations that do not pass the graph isomorphism test", and in the introduction (page 2), "accurate representations of given graphs that can pass the WL test": those sentences are confusing, as it is not about representations passing the WL test, is it? It is about two graphs which are distinguished by WL getting different representations, as correctly stated elsewhere in the work.
  - Regarding page 4, the paragraph "graph multi-head attention" and the following ones: on the one hand, they are extremely close to Vaswani et al. and Lee et al., sometimes even nearly the same sentences; on the other hand, some things are left out which are crucial for understanding, such as definitions of symbols for dimensionalities (n_q, d_k, d_v, d_model) and the origin of some matrices (see next point).
  - Where do the seeds S come from for the non-self-attention operator? It probably is a parameter matrix that is directly optimized, but it is not completely clear; "learnable" is ambiguous (it can also be the output of a network).
  - Suggestion: maybe the description of the pooling method becomes clearer when described in a top-down manner (eq. 7, eq. 6, eq. 5, att).
- Experiments: variances of graph classification results over the cross-validation would be greatly appreciated; since there seems to be a space issue, they can go into the appendix.
- The reconstruction architecture does not reconstruct adjacency; it might be interesting to see how well the method can do that, for example for the synthetic graphs.

Related work: the work [1] should be mentioned and discussed in related work and compared against in experiments; it is also a pooling method with attention, but seems to follow a different approach.

Typos:
- Figure 2 caption: "to compress the all nodes" → "all"
- Proof of theorem 4: "but it is highly efficient than node clustering methods"
- Proof of proposition 5: "then the first them inherently generates"

All in all, I think this paper has a valuable contribution, even if the method is incremental. Therefore, I tend to vote for accepting the paper, but encourage the authors to improve on the mentioned issues in experiments, related work, and presentation.

[1] Ranjan et al. ASAP: adaptive structure aware pooling for learning hierarchical graph representations. AAAI 2020.

### Summary:
The paper addresses a very important issue in GNNs: the definition of a well-defined pooling function for node aggregation. The proposed graph multiset transformer, although not entirely new, seems to be useful in practice. Issues related to experimental results, as well as problems with presentation, have been resolved by the authors' rebuttal, which presented solid experimental results and analysis. Concerns about the real expressivity of the proposed approach when compared to the Weisfeiler-Lehman graph isomorphism test do not affect the contribution delivered by the paper, which seems at this point significant.
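The reviews above repeatedly refer to the same pooling step: a learnable k×d seed matrix S used as queries against the n node embeddings, followed by self-attention over the resulting k cluster embeddings. As a reading aid, here is a minimal single-head sketch of that kind of pooling in PyTorch. It is an illustration of the general idea only, not the authors' implementation; the module and parameter names (SeedAttentionPool, n_seeds) are invented for this example, and the projections are shared across both attention rounds just to keep the sketch short.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SeedAttentionPool(nn.Module):
    """Pool n node embeddings into k cluster embeddings via learnable seed queries,
    then mix the k clusters with one round of self-attention and average them.
    Single-head for clarity; the paper under review uses multi-head attention."""

    def __init__(self, dim: int, n_seeds: int):
        super().__init__()
        self.seeds = nn.Parameter(torch.randn(n_seeds, dim))  # the k x d matrix "S"
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)

    def attend(self, q, k, v):
        scores = q @ k.transpose(-1, -2) / (q.shape[-1] ** 0.5)
        return F.softmax(scores, dim=-1) @ v

    def forward(self, h):                    # h: (n, dim) node embeddings from a GNN
        q = self.q_proj(self.seeds)          # (k, dim) seed queries
        clusters = self.attend(q, self.k_proj(h), self.v_proj(h))       # (k, dim)
        clusters = self.attend(self.q_proj(clusters),                   # self-attention
                               self.k_proj(clusters),                   # over the k
                               self.v_proj(clusters))                    # cluster embeddings
        return clusters.mean(dim=0)          # (dim,) graph-level representation

h = torch.randn(50, 64)                       # 50 nodes with 64-d features
print(SeedAttentionPool(64, n_seeds=4)(h).shape)   # torch.Size([64])
```

This also mirrors the question raised by the third reviewer: the seed-query attention already reduces the n node vectors to k, so the remaining self-attention only models interactions among the k cluster embeddings rather than compressing nodes further.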
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper studies policy optimization in reinforcement learning with wasserstein and sinkhorn trust regions compare to the standard trpo which based on kldivergence the proposed wpo and spo go beyond the parametric policy distribution class the authors also derive closed form policy update as well as theoretical performance guarantees for both problems the paper is overall well written the results are well organized and reasonably clear to follow in terms of novelty i believe the closed form update parts theorem 1 3 are reasonably standard primaldual arguments however theorem 2 is an interesting and significant contribution since it provides theoretical guarantees even when the update betat is not the optimal solution in 6 which circumvent the difficulty in obtaining closed form solution of 6 theorem 4 also seems important as it gives theoretical guarantee for us to approximate wpo by spo with large lambda i would like to discuss a few more questions with the authors the upper bound of betalambda for spo in theorem 3 is of order 1delta which seems not very good because the bound would be trivial if there is no perturbation to the policy ie delta 0 also the bound for beta for wpo in theorem 1 seems to be independent of lambda i would like the authors to discuss some insights behind the different dependency on lambda do we expect the bound for betalambda is actually also independent of delta moreover it would be also interesting to understand the optimal dependency of betalambda as a function of lambda in theorem 4 can we characterize the rate of convergence of flambdabeta fbeta seems like an upper bound could be 1lambda but is this correct or optimal if it is correct also it would be good if we could understand the convergence of the optimizers in part 2 of theorem 4 small typo in abstract extensions of policy optimziation extensions of policy optimization this paper presents two policy update frameworks wpo and spo which relax the restriction to parametric policy distribution in the standard setting theoretical guarantees are provided the numerical results also suggest that proposed policy optimization methods outperform the standard trpo and ppo with better performance and faster convergence docsepthis paper proposes to use two extensions of the trpo algorithm relying on the wasserstein distance and the sinkhorn divergence which dont require to explicitly specify a distribution for the policy the authors provide a theoretical analysis giving a closed form policy update for their two methods and a performance improvement bound in the case of wasserstein policy optimisation they evaluate their methods empirically on tabular domains taxi chain and cliff walking and on some discrete locomotion tasks cartpole acrobot they find that their method outperform trpo and ppo while being more sampleefficient and converging faster the paper is wellorganised and present a new way of incorporating the wasserstein distance within policy optimisation algorithms the theoretical results look correct at first glance although i admit didnt check them carefully in the appendix it would be nice for theorem 1 to be selfcontained some of the variables are defined earlier in the text like m or beta but it would ease the reading to define them in the theorem the presentation of the theoretical results are a bit hard to follow so adding a few explanatory sentences about the importance of each terms would 
also be helpful could the authors also comment on the computational complexity of the method in terms of experiments the domains considered seem to show the benefits of the method i believe stronger tasks would make the paper stronger in particular continuous control ones or at least showing results for a few more domains related work wassersteinlike metrics have only been recently studied in the context of reinforcement learning please note that the wasserstein metric has been used in rl since at least 2012 with the introduction of bisimulation metrics https://arxiv.org/pdf/1207.4114.pdf and recent work hasnt only used it for imitation learning but also for generalization https://arxiv.org/pdf/2101.05265.pdf https://arxiv.org/abs/2102.01514 https://arxiv.org/abs/2006.10742 i believe the wasserstein and sinkhorn metrics have also been used in the distributional rl literature so it might be nice to discuss the similarity of ideas in both areas typos "wassersteim" should be "the wasserstein metric" and please dont use "etc" and add all necessary details this paper provides a new way of optimising the policy distribution relying on the wasserstein and sinkhorn distances the method is theoretically grounded so i would recommend an accept but i believe the authors would need to evaluate their method on more domains to make the paper stronger docsepthis paper considers wasserstein and sinkhorn trust region policy optimization wpo and spo for reinforcement learning unlike existing works on wpo it does not assume a parametric policy class theoretically it shows the performance improvement of wpo at every iteration and that spo converges to wpo it also conducts some experiments to illustrate the advantages of the proposed methods strength 1 the paper provides a relatively complete analysis of the wasserstein and sinkhorn policy optimization including closedform policy updates performance improvement bound actorcritic algorithm and experiments on popular instances weakness 1 the theoretical advances seem quite limited the closedform policy update in theorem 1 and theorem 3 should follow from existing duality results on wasserstein and sinkhorn optimization for example 1 and 2 it is unfortunate that such connections are not recognized in the paper theorem 2 is a simple consequence of the wellknown performance difference lemma theorem 4 is sort of expected given the many existing results on the relationship between wasserstein distance and sinkhorn distance in addition it is unclear whether spo has a performance improvement at each iteration for any fixed entropic regularization parameter 2 computationalwise i am wondering how the policy update in wpo and spo can be implemented when the stateaction space is large and the computation of the parameters beta_t seems also timeconsuming whereas i did not find a technically sound argument for the heuristic choices of beta 3 the experiment results are not convincing enough to demonstrate clear advantages of wpo or spo 1 there is no comparison with existing wasserstein policy optimization with parametric classes such as algorithms in moskovitz et al 2020 and pacchinao et al 2020 therefore it is unclear to me whether it is worth considering a nonparametric policy update given the high computational cost for large spaces 2 in most results for example figures 235 spo is outperformed by wpo in terms of the reward at the end of the timesteps although it converges to a suboptimal value faster it might be helpful to consider a varying entropic regularization parameter references 1 blanchet jose and karthyek
murthy quantifying distributional model risk via optimal transport mathematics of operations research 44(2) 565-600 2019 2 wang jie rui gao and yao xie sinkhorn distributionally robust optimization arxiv preprint arxiv:2109.11926 2021 the main feature of this paper is considering a nonparametric policy class for wasserstein policy optimization however the theoretical contribution adds marginally to the existing literature and the numerical findings are not convincing to demonstrate the advantages of the new framework docsepthe paper introduces wasserstein and sinkhorn policy optimization with three main contributions first the paper derives closed form expressions for updates using lagrange multipliers second it proves monotonicity of performance improvement third experiments show efficiency and effectiveness relative to trpo and ppo review of efficient wasserstein and sinkhorn policy optimization in general i found the paper to be a nice if modest extension of trpo in essence the key idea is trading kl regularization for wasserstein or sinkhorn experiments show this is in general a win particularly for the case of approximated advantage functions i found the exposition to be mostly fine with some minor critiques first it would be better to have a more detailed treatment of the prior work trpo is properly introduced later second separately discussing wasserstein and sinkhorn is unnecessary and redundant they are very closely related and the text and math would benefit from making that clearer additional points the discussion of related work is inadequate it would be desirable to make a clear statement of what the reasons for and consequences of assumption 1 are compared to the performance bound when using klbased trust region see eg schulman et al 2015 cen et al 2020 using the wasserstein metric yields a tighter performance improvement bound and is more robust to the choice of parameters t it would be nice to show this explicitly theorems 1 and 3 are closely related as are wasserstein and sinkhorn is it really necessary to break them into separate sections this choice makes the relation less clear overall this is not the most ambitious paper but generally a nice contribution ### Summary:
this paper proposes two extensions of the trpo algorithm in which the trust region is defined using the wasserstein distance and the sinkhorn divergence the proposed methods do not restrict the policy to belong to a parametric distribution class and the authors provide closedform policy updates and a performance improvement bound for the wasserstein policy optimization the authors provide an empirical evaluation of their approaches on tabular domains and some discrete locomotion tasks comparing the performance with some stateoftheart policy optimization approaches after reading the authors feedback and interacting with the authors the reviewers did not reach a consensus one of the reviewers votes for rejection while the other three reviewers are slightly positive in particular the reviewer that voted for rejection raised a number of concerns that have been discussed at length with the authors who were able to clarify some of the issues but some of the answers did not satisfy the reviewer i went through the paper and i found the paper solid from a technical point of view but i share some of the reviewers concerns and i think that the authors should better position their contribution with respect to the state of the art overall this paper is borderline and i feel it needs still some work to deserve clear acceptance which i think will be soon
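the reviews and the summary above repeatedly rely on the fact that the sinkhorn divergence is an entropy regularized optimal transport cost that approaches the wasserstein cost as the entropic weight grows which is why spo approximates wpo for large lambda. a minimal numpy sketch of that computation for two discrete policies over the same action set is given below; the function and variable names are illustrative assumptions rather than the notation of the paper under review and eps plays the role of an inverse entropic weight under one common convention so shrinking eps mimics letting lambda grow.

```python
import numpy as np

def sinkhorn_cost(p, q, cost, eps=0.1, n_iters=200):
    """entropy regularized optimal transport cost between discrete distributions p and q."""
    K = np.exp(-cost / eps)             # gibbs kernel of the ground cost
    u = np.ones_like(p)
    v = np.ones_like(q)
    for _ in range(n_iters):            # alternating sinkhorn scaling updates
        u = p / (K @ v)
        v = q / (K.T @ u)
    plan = u[:, None] * K * v[None, :]  # approximate transport plan between the two policies
    return float(np.sum(plan * cost))

# toy check: two policies over three actions with a 0/1 ground cost
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.2, 0.5, 0.3])
cost = 1.0 - np.eye(3)
print(sinkhorn_cost(p, q, cost, eps=0.05))  # tends to the wasserstein cost as eps shrinks
```

the scaling updates act on a kernel whose size is quadratic in the number of atoms of the two distributions, which also makes the reviewers concern about large stateaction spaces concrete.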
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 2175, 3646, 13757, 275, 35221, 4715, 342, 369, 2152, 6339, 285, 16338, 27721, 4517, 4811, 7277, 281, 253, 2629, 492, 5367, 534, 1754, 327, 465, 392, 2373, 9515, 253, 4081, 259, 5367, 285, 15695, 564, 4457, 253, 36833, 3646, 3268, 966, 253, 4477, 671, 15313, 4581, 830, 3646, 5731, 347, 973, 347, 10527, 3045, 23632, 323, 1097, 3237, 253, 2929, 310, 4583, 973, 3542, 253, 1543, 403, 973, 10932, 285, 12054, 2590, 281, 956, 275, 2426, 273, 38135, 891, 2868, 253, 4581, 830, 5731, 4243, 10012, 337, 50276, 20, 403, 12054, 2629, 819, 1983, 34716, 7125, 2299, 10012, 374, 310, 271, 4722, 285, 1534, 7680, 1580, 352, 3400, 10527, 23632, 1014, 672, 253, 5731, 701, 255, 310, 417, 253, 8654, 2900, 275, 721, 534, 39256, 253, 10183, 275, 13546, 4581, 830, 2900, 273, 721, 10012, 577, 671, 3133, 1774, 347, 352, 4245, 10527, 12215, 323, 441, 281, 16851, 259, 5367, 407, 15695, 342, 1781, 29331, 891, 651, 751, 281, 2319, 247, 1643, 625, 3533, 342, 253, 4477, 50275, 783, 5170, 3033, 273, 701, 267, 1836, 323, 15695, 275, 10012, 495, 310, 273, 1340, 337, 3005, 534, 3133, 417, 1077, 1175, 984, 253, 3033, 651, 320, 14916, 604, 627, 310, 642, 20452, 281, 253, 3646, 26332, 18687, 50276, 17, 671, 253, 3033, 323, 9840, 323, 259, 5367, 275, 10012, 337, 3133, 281, 320, 3907, 273, 29331, 891, 651, 751, 253, 4477, 281, 2319, 690, 16039, 3212, 253, 1027, 18925, 327, 29331, 513, 359, 1902, 253, 3033, 323, 701, 267, 1836, 310, 2686, 671, 3907, 273, 18687, 25761, 352, 651, 320, 671, 4722, 281, 2096, 253, 8654, 18925, 273, 701, 267, 1836, 347, 247, 1159, 273, 29331, 275, 10012, 577, 476, 359, 17710, 253, 2281, 273, 14940, 273, 892, 1369, 69, 357, 1464, 50276, 71, 2461, 3133, 751, 271, 5170, 3033, 812, 320, 337, 2260, 533, 310, 436, 3451, 390, 8654, 604, 352, 310, 3451, 671, 352, 651, 320, 1175, 604, 359, 812, 2096, 253, 14940, 273, 253, 5556, 14460, 275, 629, 374, 273, 10012, 577, 50276, 6795, 1745, 80, 275, 12002, 18149, 273, 3646, 5556, 91, 2492, 50276, 40028, 273, 3646, 13757, 50276, 2520, 2929, 10262, 767, 3646, 5731, 31225, 259, 5367, 285, 15695, 534, 7921, 253, 12400, 281, 36833, 3646, 3268, 275, 253, 2629, 4758, 10527, 23632, 403, 2530, 253, 10704, 1543, 671, 1804, 326, 4081, 3646, 13757, 3082, 562, 32231, 253, 2629, 492, 5367, 285, 268, 5367, 342, 1805, 3045, 285, 7938, 14940, 50276, 7152, 33032, 2520, 2929, 29328, 281, 897, 767, 18149, 273, 253, 492, 5367, 5933, 22128, 327, 253, 369, 2152, 6339, 4181, 285, 253, 16338, 27721, 23279, 534, 13414, 2430, 281, 11120, 13199, 247, 3268, 323, 253, 3646, 253, 4477, 2085, 247, 10527, 1783, 4933, 247, 4581, 830, 3646, 5731, 323, 616, 767, 3082, 285, 247, 3045, 7756, 3033, 275, 253, 1083, 273, 369, 2152, 6339, 3646, 5556, 5837, 597, 7472, 616, 3082, 45190, 327, 10334, 792, 10625, 24928, 5931, 285, 28900, 7824, 285, 327, 690, 13358, 23904, 5011, 8892, 7281, 36479, 913, 287, 12042, 597, 1089, 326, 616, 1332, 562, 32231, 492, 5367, 285, 268, 5367, 1223, 1146, 625, 3410, 20246, 285, 5975, 3390, 7938, 253, 2929, 310, 973, 7397, 1701, 285, 1246, 247, 747, 1039, 273, 24049, 253, 369, 2152, 6339, 4181, 1561, 3646, 5556, 5837, 11333, 50276, 783, 10527, 1543, 1007, 3451, 387, 806, 17834, 3738, 891, 11476, 42126, 2451, 731, 9257, 275, 253, 30762, 50276, 262, 651, 320, 5322, 323, 10012, 337, 281, 320, 1881, 41010, 690, 273, 253, 4903, 403, 2931, 4321, 275, 253, 2505, 751, 278, 390, 9840, 533, 352, 651, 11990, 253, 4361, 281, 4853, 731, 275, 
253, 10012, 253, 9759, 273, 253, 10527, 1543, 403, 247, 2372, 1892, 281, 956, 594, 6240, 247, 1643, 41355, 14683, 670, 253, 6349, 273, 1016, 2426, 651, 671, 320, 9371, 50276, 16534, 253, 4477, 671, 4385, 327, 253, 15180, 10454, 273, 253, 1332, 50276, 249, 2426, 273, 4679, 253, 10625, 2783, 1646, 281, 921, 253, 5373, 273, 253, 1332, 891, 2868, 10046, 8892, 651, 1056, 253, 2929, 10046, 275, 1798, 5415, 1453, 4394, 390, 387, 1878, 4645, 1543, 323, 247, 1643, 625, 10625, 50276, 4919, 789, 369, 2152, 6339, 3022, 17082, 452, 760, 644, 4102, 5421, 275, 253, 3634, 273, 35221, 4715, 4496, 3877, 326, 253, 369, 2152, 6339, 7982, 556, 644, 908, 275, 391, 77, 1580, 387, 1878, 4050, 342, 253, 10199, 273, 17542, 303, 1427, 17082, 5987, 39962, 2061, 9275, 805, 39768, 13391, 9275, 285, 3332, 789, 419, 2254, 760, 908, 352, 323, 45738, 4715, 533, 671, 323, 26647, 5987, 39962, 2061, 9275, 1797, 520, 1762, 21317, 9275, 5987, 39962, 2061, 5375, 16899, 6620, 1047, 5987, 39962, 2061, 5375, 8603, 12224, 2945, 891, 2868, 253, 369, 2152, 6339, 285, 16338, 27721, 17082, 452, 671, 644, 908, 275, 253, 3268, 267, 391, 77, 6239, 594, 352, 1537, 320, 5322, 281, 2319, 253, 14259, 273, 5697, 275, 1097, 3672, 50276, 555, 993, 369, 2152, 3241, 303, 50276, 783, 369, 2152, 6339, 7982, 3966, 50276, 32897, 13414, 897, 3966, 285, 823, 512, 3309, 4278, 436, 2929, 3400, 247, 747, 1039, 273, 5556, 2182, 253, 3646, 3268, 22128, 327, 253, 369, 2152, 6339, 285, 16338, 27721, 13849, 253, 3082, 310, 28055, 28462, 594, 891, 651, 5583, 271, 2997, 533, 891, 2868, 253, 4477, 651, 878, 281, 7472, 616, 1332, 327, 625, 10625, 281, 1056, 253, 2929, 10046, 5474, 33032, 2520, 2929, 19401, 369, 2152, 6339, 285, 16338, 27721, 4517, 2919, 3646, 13757, 259, 5367, 285, 15695, 323, 35221, 4715, 12401, 5368, 2987, 327, 259, 5367, 352, 1057, 417, 5467, 247, 36833, 3646, 966, 28055, 352, 2722, 253, 3045, 7756, 273, 259, 5367, 387, 1046, 19502, 285, 326, 15695, 26414, 281, 259, 5367, 352, 671, 2589, 84, 690, 1172, 74, 3658, 281, 17093, 253, 11361, 273, 253, 4081, 3082, 4757, 50276, 18, 253, 2929, 3400, 247, 4942, 3426, 1783, 273, 253, 369, 2152, 6339, 285, 16338, 27721, 3646, 13757, 1690, 4581, 630, 3646, 11269, 3045, 7756, 3033, 12353, 68, 17425, 5933, 285, 4679, 327, 4633, 10872, 50276, 20881, 1255, 50276, 18, 253, 10527, 16424, 1646, 3240, 3710, 253, 4581, 630, 3646, 5731, 275, 10012, 337, 285, 10012, 495, 943, 956, 432, 5368, 34962, 1543, 327, 369, 2152, 6339, 285, 16338, 27721, 13757, 323, 1650, 337, 285, 374, 352, 310, 23293, 326, 824, 10291, 403, 417, 7478, 275, 253, 2929, 10012, 374, 310, 247, 2969, 9936, 273, 253, 973, 4304, 3045, 3064, 18057, 10012, 577, 310, 3686, 273, 3264, 347, 1142, 5368, 1543, 327, 253, 2954, 875, 369, 2152, 6339, 4181, 285, 16338, 27721, 4181, 275, 1635, 352, 310, 12744, 1880, 15695, 556, 247, 3045, 7756, 387, 1016, 19502, 323, 667, 4229, 994, 12189, 37820, 4764, 50276, 19, 15180, 3020, 891, 717, 12371, 849, 253, 3646, 5731, 275, 259, 5367, 285, 15695, 476, 320, 9009, 672, 253, 1375, 1913, 2317, 310, 1781, 285, 253, 13782, 273, 253, 3602, 701, 255, 3133, 671, 673, 33136, 5727, 891, 858, 417, 1089, 247, 22335, 3590, 4154, 323, 253, 47641, 10165, 273, 9840, 50275, 20, 253, 3368, 1543, 403, 417, 21414, 2217, 281, 7568, 2590, 11361, 273, 259, 5367, 390, 15695, 50275, 18, 627, 310, 642, 5301, 342, 5368, 369, 2152, 6339, 3646, 13757, 342, 36833, 5971, 824, 347, 11333, 275, 15039, 17131, 5432, 1162, 355, 9169, 285, 19162, 348, 1758, 80, 1162, 355, 9169, 3103, 352, 310, 12744, 281, 479, 1880, 352, 310, 4409, 7296, 247, 1327, 36928, 
3646, 5731, 1677, 326, 310, 253, 1029, 15180, 2105, 323, 1781, 8470, 50276, 19, 275, 954, 1543, 323, 1650, 8442, 23540, 15695, 310, 41731, 10574, 407, 259, 5367, 275, 2426, 273, 253, 10921, 387, 253, 990, 273, 253, 4522, 383, 2265, 3738, 352, 26414, 281, 247, 749, 29776, 1318, 7938, 352, 1537, 320, 9371, 281, 1908, 247, 11962, 994, 12189, 37820, 4764, 50276, 250, 3065, 337, 787, 1377, 6168, 480, 583, 285, 465, 27914, 1441, 4682, 24085, 2677, 5411, 3268, 267, 1566, 2495, 3066, 8654, 4616, 23065, 273, 5871, 2561, 42222, 6247, 46472, 10487, 374, 259, 606, 480, 466, 391, 4113, 305, 8500, 285, 340, 8500, 1269, 466, 16338, 27721, 3268, 595, 10237, 13757, 549, 32693, 638, 3845, 549, 32693, 16899, 4739, 746, 1731, 43425, 253, 2022, 4735, 273, 436, 2929, 310, 7296, 247, 1327, 36928, 3646, 966, 323, 369, 2152, 6339, 3646, 13757, 2299, 253, 10527, 7680, 11323, 16888, 281, 253, 5368, 6239, 285, 253, 10704, 4342, 403, 417, 21414, 281, 7568, 253, 11361, 273, 253, 747, 7792, 5474, 339, 431, 248, 2929, 23970, 369, 2152, 6339, 285, 16338, 27721, 3646, 13757, 342, 1264, 2022, 9021, 806, 253, 2929, 38422, 4581, 830, 12091, 323, 11269, 970, 16653, 6324, 18878, 4670, 1273, 352, 19539, 45973, 414, 273, 3045, 7756, 2626, 4679, 921, 6733, 285, 12510, 4103, 281, 492, 5367, 285, 268, 5367, 50275, 15337, 273, 5919, 369, 2152, 6339, 285, 16338, 27721, 3646, 13757, 50276, 249, 2087, 891, 1119, 253, 2929, 281, 320, 247, 5322, 604, 16453, 6880, 273, 492, 5367, 275, 17718, 253, 2234, 2934, 310, 11947, 27451, 37820, 323, 369, 2152, 6339, 390, 16338, 27721, 4679, 921, 436, 310, 275, 2087, 247, 3330, 3782, 323, 253, 1083, 273, 34930, 5750, 3470, 891, 1119, 253, 47284, 281, 320, 6571, 4030, 342, 690, 5884, 2268, 4624, 806, 352, 651, 320, 1805, 281, 452, 247, 625, 7000, 1971, 273, 253, 2720, 789, 492, 5367, 310, 6283, 5611, 1996, 1273, 11794, 16585, 369, 2152, 6339, 285, 16338, 27721, 310, 15279, 285, 28116, 597, 403, 1077, 8244, 2905, 285, 253, 2505, 285, 14168, 651, 5649, 432, 2403, 326, 30909, 50274, 38092, 2792, 50275, 783, 5955, 273, 2905, 789, 310, 18766, 50275, 262, 651, 320, 11408, 281, 1056, 247, 2590, 3908, 273, 752, 253, 4606, 323, 285, 9099, 273, 9376, 337, 403, 50275, 3118, 1096, 281, 253, 3045, 3033, 672, 970, 27451, 3169, 4517, 2919, 923, 24088, 5807, 335, 1342, 1162, 355, 4104, 260, 257, 1162, 355, 9169, 970, 253, 369, 2152, 6339, 7982, 11026, 247, 40638, 3045, 7756, 3033, 285, 310, 625, 10237, 281, 253, 4327, 273, 3602, 246, 352, 651, 320, 5322, 281, 921, 436, 11120, 50275, 783, 28657, 337, 285, 495, 403, 8244, 2905, 347, 403, 369, 2152, 6339, 285, 16338, 27721, 310, 310, 1663, 3309, 281, 2740, 731, 715, 4858, 7118, 436, 4327, 2789, 253, 5886, 1679, 2590, 50276, 1189, 455, 436, 310, 417, 253, 954, 24683, 2929, 533, 3839, 247, 5322, 7680, 50274, 187, 187, 4118, 18435, 27, 2520, 2929, 29328, 767, 18149, 273, 253, 492, 5367, 5933, 275, 534, 253, 4517, 2919, 310, 2931, 970, 253, 369, 2152, 6339, 4181, 285, 253, 16338, 27721, 23279, 253, 4081, 3082, 513, 417, 4656, 253, 3646, 281, 5663, 281, 247, 36833, 3268, 966, 285, 253, 4477, 2085, 50276, 13784, 630, 3646, 11269, 285, 247, 3045, 7756, 3033, 323, 253, 369, 2152, 6339, 3646, 13757, 253, 4477, 2085, 271, 16774, 7103, 273, 616, 7274, 327, 10334, 792, 10625, 285, 690, 13358, 23904, 5011, 8892, 10941, 253, 3045, 342, 690, 1375, 23037, 14387, 3646, 13757, 7274, 50276, 6438, 4361, 253, 4477, 8680, 285, 18745, 342, 253, 4477, 253, 30628, 858, 417, 3986, 247, 13969, 581, 273, 253, 30628, 13008, 323, 18235, 1223, 253, 643, 1264, 30628, 403, 5777, 2762, 275, 1798, 253, 
37317, 326, 14285, 323, 18235, 5439, 247, 1180, 273, 7350, 326, 452, 644, 5469, 387, 2978, 342, 253, 4477, 665, 497, 2104, 281, 19148, 690, 273, 253, 3374, 533, 690, 273, 253, 9172, 858, 417, 10517, 253, 37317, 891, 2427, 949, 253, 2929, 285, 891, 1119, 253, 2929, 4891, 432, 247, 7681, 1127, 273, 1859, 533, 891, 3894, 690, 273, 253, 30628, 7350, 285, 891, 1158, 326, 253, 4477, 943, 1805, 1899, 616, 7680, 342, 1675, 281, 253, 1375, 273, 253, 1445, 50276, 1189, 455, 436, 2929, 310, 45210, 285, 891, 1928, 352, 3198, 1335, 690, 789, 281, 17337, 2590, 14924, 534, 891, 1158, 588, 320, 3517 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 2175, 3646, 13757, 275, 35221, 4715, 342, 369, 2152, 6339, 285, 16338, 27721, 4517, 4811, 7277, 281, 253, 2629, 492, 5367, 534, 1754, 327, 465, 392, 2373, 9515, 253, 4081, 259, 5367, 285, 15695, 564, 4457, 253, 36833, 3646, 3268, 966, 253, 4477, 671, 15313, 4581, 830, 3646, 5731, 347, 973, 347, 10527, 3045, 23632, 323, 1097, 3237, 253, 2929, 310, 4583, 973, 3542, 253, 1543, 403, 973, 10932, 285, 12054, 2590, 281, 956, 275, 2426, 273, 38135, 891, 2868, 253, 4581, 830, 5731, 4243, 10012, 337, 50276, 20, 403, 12054, 2629, 819, 1983, 34716, 7125, 2299, 10012, 374, 310, 271, 4722, 285, 1534, 7680, 1580, 352, 3400, 10527, 23632, 1014, 672, 253, 5731, 701, 255, 310, 417, 253, 8654, 2900, 275, 721, 534, 39256, 253, 10183, 275, 13546, 4581, 830, 2900, 273, 721, 10012, 577, 671, 3133, 1774, 347, 352, 4245, 10527, 12215, 323, 441, 281, 16851, 259, 5367, 407, 15695, 342, 1781, 29331, 891, 651, 751, 281, 2319, 247, 1643, 625, 3533, 342, 253, 4477, 50275, 783, 5170, 3033, 273, 701, 267, 1836, 323, 15695, 275, 10012, 495, 310, 273, 1340, 337, 3005, 534, 3133, 417, 1077, 1175, 984, 253, 3033, 651, 320, 14916, 604, 627, 310, 642, 20452, 281, 253, 3646, 26332, 18687, 50276, 17, 671, 253, 3033, 323, 9840, 323, 259, 5367, 275, 10012, 337, 3133, 281, 320, 3907, 273, 29331, 891, 651, 751, 253, 4477, 281, 2319, 690, 16039, 3212, 253, 1027, 18925, 327, 29331, 513, 359, 1902, 253, 3033, 323, 701, 267, 1836, 310, 2686, 671, 3907, 273, 18687, 25761, 352, 651, 320, 671, 4722, 281, 2096, 253, 8654, 18925, 273, 701, 267, 1836, 347, 247, 1159, 273, 29331, 275, 10012, 577, 476, 359, 17710, 253, 2281, 273, 14940, 273, 892, 1369, 69, 357, 1464, 50276, 71, 2461, 3133, 751, 271, 5170, 3033, 812, 320, 337, 2260, 533, 310, 436, 3451, 390, 8654, 604, 352, 310, 3451, 671, 352, 651, 320, 1175, 604, 359, 812, 2096, 253, 14940, 273, 253, 5556, 14460, 275, 629, 374, 273, 10012, 577, 50276, 6795, 1745, 80, 275, 12002, 18149, 273, 3646, 5556, 91, 2492, 50276, 40028, 273, 3646, 13757, 50276, 2520, 2929, 10262, 767, 3646, 5731, 31225, 259, 5367, 285, 15695, 534, 7921, 253, 12400, 281, 36833, 3646, 3268, 275, 253, 2629, 4758, 10527, 23632, 403, 2530, 253, 10704, 1543, 671, 1804, 326, 4081, 3646, 13757, 3082, 562, 32231, 253, 2629, 492, 5367, 285, 268, 5367, 342, 1805, 3045, 285, 7938, 14940, 50276, 7152, 33032, 2520, 2929, 29328, 281, 897, 767, 18149, 273, 253, 492, 5367, 5933, 22128, 327, 253, 369, 2152, 6339, 4181, 285, 253, 16338, 27721, 23279, 534, 13414, 2430, 281, 11120, 13199, 247, 3268, 323, 253, 3646, 253, 4477, 2085, 247, 10527, 1783, 4933, 247, 4581, 830, 3646, 5731, 323, 616, 767, 3082, 285, 247, 3045, 7756, 3033, 275, 253, 1083, 273, 369, 2152, 6339, 3646, 5556, 5837, 597, 7472, 616, 3082, 45190, 327, 10334, 792, 10625, 24928, 5931, 285, 28900, 7824, 285, 327, 690, 13358, 23904, 5011, 8892, 7281, 36479, 913, 287, 12042, 597, 1089, 326, 616, 1332, 562, 32231, 492, 5367, 285, 268, 5367, 1223, 1146, 625, 3410, 20246, 285, 5975, 3390, 7938, 253, 2929, 310, 973, 7397, 1701, 285, 1246, 247, 747, 1039, 273, 24049, 253, 369, 2152, 6339, 4181, 1561, 3646, 5556, 5837, 11333, 50276, 783, 10527, 1543, 1007, 3451, 387, 806, 17834, 3738, 891, 11476, 42126, 2451, 731, 9257, 275, 253, 30762, 50276, 262, 651, 320, 5322, 323, 10012, 337, 281, 320, 1881, 41010, 690, 273, 253, 4903, 403, 2931, 4321, 275, 253, 2505, 751, 278, 390, 9840, 533, 352, 651, 11990, 253, 4361, 281, 4853, 731, 275, 
253, 10012, 253, 9759, 273, 253, 10527, 1543, 403, 247, 2372, 1892, 281, 956, 594, 6240, 247, 1643, 41355, 14683, 670, 253, 6349, 273, 1016, 2426, 651, 671, 320, 9371, 50276, 16534, 253, 4477, 671, 4385, 327, 253, 15180, 10454, 273, 253, 1332, 50276, 249, 2426, 273, 4679, 253, 10625, 2783, 1646, 281, 921, 253, 5373, 273, 253, 1332, 891, 2868, 10046, 8892, 651, 1056, 253, 2929, 10046, 275, 1798, 5415, 1453, 4394, 390, 387, 1878, 4645, 1543, 323, 247, 1643, 625, 10625, 50276, 4919, 789, 369, 2152, 6339, 3022, 17082, 452, 760, 644, 4102, 5421, 275, 253, 3634, 273, 35221, 4715, 4496, 3877, 326, 253, 369, 2152, 6339, 7982, 556, 644, 908, 275, 391, 77, 1580, 387, 1878, 4050, 342, 253, 10199, 273, 17542, 303, 1427, 17082, 5987, 39962, 2061, 9275, 805, 39768, 13391, 9275, 285, 3332, 789, 419, 2254, 760, 908, 352, 323, 45738, 4715, 533, 671, 323, 26647, 5987, 39962, 2061, 9275, 1797, 520, 1762, 21317, 9275, 5987, 39962, 2061, 5375, 16899, 6620, 1047, 5987, 39962, 2061, 5375, 8603, 12224, 2945, 891, 2868, 253, 369, 2152, 6339, 285, 16338, 27721, 17082, 452, 671, 644, 908, 275, 253, 3268, 267, 391, 77, 6239, 594, 352, 1537, 320, 5322, 281, 2319, 253, 14259, 273, 5697, 275, 1097, 3672, 50276, 555, 993, 369, 2152, 3241, 303, 50276, 783, 369, 2152, 6339, 7982, 3966, 50276, 32897, 13414, 897, 3966, 285, 823, 512, 3309, 4278, 436, 2929, 3400, 247, 747, 1039, 273, 5556, 2182, 253, 3646, 3268, 22128, 327, 253, 369, 2152, 6339, 285, 16338, 27721, 13849, 253, 3082, 310, 28055, 28462, 594, 891, 651, 5583, 271, 2997, 533, 891, 2868, 253, 4477, 651, 878, 281, 7472, 616, 1332, 327, 625, 10625, 281, 1056, 253, 2929, 10046, 5474, 33032, 2520, 2929, 19401, 369, 2152, 6339, 285, 16338, 27721, 4517, 2919, 3646, 13757, 259, 5367, 285, 15695, 323, 35221, 4715, 12401, 5368, 2987, 327, 259, 5367, 352, 1057, 417, 5467, 247, 36833, 3646, 966, 28055, 352, 2722, 253, 3045, 7756, 273, 259, 5367, 387, 1046, 19502, 285, 326, 15695, 26414, 281, 259, 5367, 352, 671, 2589, 84, 690, 1172, 74, 3658, 281, 17093, 253, 11361, 273, 253, 4081, 3082, 4757, 50276, 18, 253, 2929, 3400, 247, 4942, 3426, 1783, 273, 253, 369, 2152, 6339, 285, 16338, 27721, 3646, 13757, 1690, 4581, 630, 3646, 11269, 3045, 7756, 3033, 12353, 68, 17425, 5933, 285, 4679, 327, 4633, 10872, 50276, 20881, 1255, 50276, 18, 253, 10527, 16424, 1646, 3240, 3710, 253, 4581, 630, 3646, 5731, 275, 10012, 337, 285, 10012, 495, 943, 956, 432, 5368, 34962, 1543, 327, 369, 2152, 6339, 285, 16338, 27721, 13757, 323, 1650, 337, 285, 374, 352, 310, 23293, 326, 824, 10291, 403, 417, 7478, 275, 253, 2929, 10012, 374, 310, 247, 2969, 9936, 273, 253, 973, 4304, 3045, 3064, 18057, 10012, 577, 310, 3686, 273, 3264, 347, 1142, 5368, 1543, 327, 253, 2954, 875, 369, 2152, 6339, 4181, 285, 16338, 27721, 4181, 275, 1635, 352, 310, 12744, 1880, 15695, 556, 247, 3045, 7756, 387, 1016, 19502, 323, 667, 4229, 994, 12189, 37820, 4764, 50276, 19, 15180, 3020, 891, 717, 12371, 849, 253, 3646, 5731, 275, 259, 5367, 285, 15695, 476, 320, 9009, 672, 253, 1375, 1913, 2317, 310, 1781, 285, 253, 13782, 273, 253, 3602, 701, 255, 3133, 671, 673, 33136, 5727, 891, 858, 417, 1089, 247, 22335, 3590, 4154, 323, 253, 47641, 10165, 273, 9840, 50275, 20, 253, 3368, 1543, 403, 417, 21414, 2217, 281, 7568, 2590, 11361, 273, 259, 5367, 390, 15695, 50275, 18, 627, 310, 642, 5301, 342, 5368, 369, 2152, 6339, 3646, 13757, 342, 36833, 5971, 824, 347, 11333, 275, 15039, 17131, 5432, 1162, 355, 9169, 285, 19162, 348, 1758, 80, 1162, 355, 9169, 3103, 352, 310, 12744, 281, 479, 1880, 352, 310, 4409, 7296, 247, 1327, 36928, 
3646, 5731, 1677, 326, 310, 253, 1029, 15180, 2105, 323, 1781, 8470, 50276, 19, 275, 954, 1543, 323, 1650, 8442, 23540, 15695, 310, 41731, 10574, 407, 259, 5367, 275, 2426, 273, 253, 10921, 387, 253, 990, 273, 253, 4522, 383, 2265, 3738, 352, 26414, 281, 247, 749, 29776, 1318, 7938, 352, 1537, 320, 9371, 281, 1908, 247, 11962, 994, 12189, 37820, 4764, 50276, 250, 3065, 337, 787, 1377, 6168, 480, 583, 285, 465, 27914, 1441, 4682, 24085, 2677, 5411, 3268, 267, 1566, 2495, 3066, 8654, 4616, 23065, 273, 5871, 2561, 42222, 6247, 46472, 10487, 374, 259, 606, 480, 466, 391, 4113, 305, 8500, 285, 340, 8500, 1269, 466, 16338, 27721, 3268, 595, 10237, 13757, 549, 32693, 638, 3845, 549, 32693, 16899, 4739, 746, 1731, 43425, 253, 2022, 4735, 273, 436, 2929, 310, 7296, 247, 1327, 36928, 3646, 966, 323, 369, 2152, 6339, 3646, 13757, 2299, 253, 10527, 7680, 11323, 16888, 281, 253, 5368, 6239, 285, 253, 10704, 4342, 403, 417, 21414, 281, 7568, 253, 11361, 273, 253, 747, 7792, 5474, 339, 431, 248, 2929, 23970, 369, 2152, 6339, 285, 16338, 27721, 3646, 13757, 342, 1264, 2022, 9021, 806, 253, 2929, 38422, 4581, 830, 12091, 323, 11269, 970, 16653, 6324, 18878, 4670, 1273, 352, 19539, 45973, 414, 273, 3045, 7756, 2626, 4679, 921, 6733, 285, 12510, 4103, 281, 492, 5367, 285, 268, 5367, 50275, 15337, 273, 5919, 369, 2152, 6339, 285, 16338, 27721, 3646, 13757, 50276, 249, 2087, 891, 1119, 253, 2929, 281, 320, 247, 5322, 604, 16453, 6880, 273, 492, 5367, 275, 17718, 253, 2234, 2934, 310, 11947, 27451, 37820, 323, 369, 2152, 6339, 390, 16338, 27721, 4679, 921, 436, 310, 275, 2087, 247, 3330, 3782, 323, 253, 1083, 273, 34930, 5750, 3470, 891, 1119, 253, 47284, 281, 320, 6571, 4030, 342, 690, 5884, 2268, 4624, 806, 352, 651, 320, 1805, 281, 452, 247, 625, 7000, 1971, 273, 253, 2720, 789, 492, 5367, 310, 6283, 5611, 1996, 1273, 11794, 16585, 369, 2152, 6339, 285, 16338, 27721, 310, 15279, 285, 28116, 597, 403, 1077, 8244, 2905, 285, 253, 2505, 285, 14168, 651, 5649, 432, 2403, 326, 30909, 50274, 38092, 2792, 50275, 783, 5955, 273, 2905, 789, 310, 18766, 50275, 262, 651, 320, 11408, 281, 1056, 247, 2590, 3908, 273, 752, 253, 4606, 323, 285, 9099, 273, 9376, 337, 403, 50275, 3118, 1096, 281, 253, 3045, 3033, 672, 970, 27451, 3169, 4517, 2919, 923, 24088, 5807, 335, 1342, 1162, 355, 4104, 260, 257, 1162, 355, 9169, 970, 253, 369, 2152, 6339, 7982, 11026, 247, 40638, 3045, 7756, 3033, 285, 310, 625, 10237, 281, 253, 4327, 273, 3602, 246, 352, 651, 320, 5322, 281, 921, 436, 11120, 50275, 783, 28657, 337, 285, 495, 403, 8244, 2905, 347, 403, 369, 2152, 6339, 285, 16338, 27721, 310, 310, 1663, 3309, 281, 2740, 731, 715, 4858, 7118, 436, 4327, 2789, 253, 5886, 1679, 2590, 50276, 1189, 455, 436, 310, 417, 253, 954, 24683, 2929, 533, 3839, 247, 5322, 7680, 50274, 187, 187, 4118, 18435, 27, 2520, 2929, 29328, 767, 18149, 273, 253, 492, 5367, 5933, 275, 534, 253, 4517, 2919, 310, 2931, 970, 253, 369, 2152, 6339, 4181, 285, 253, 16338, 27721, 23279, 253, 4081, 3082, 513, 417, 4656, 253, 3646, 281, 5663, 281, 247, 36833, 3268, 966, 285, 253, 4477, 2085, 50276, 13784, 630, 3646, 11269, 285, 247, 3045, 7756, 3033, 323, 253, 369, 2152, 6339, 3646, 13757, 253, 4477, 2085, 271, 16774, 7103, 273, 616, 7274, 327, 10334, 792, 10625, 285, 690, 13358, 23904, 5011, 8892, 10941, 253, 3045, 342, 690, 1375, 23037, 14387, 3646, 13757, 7274, 50276, 6438, 4361, 253, 4477, 8680, 285, 18745, 342, 253, 4477, 253, 30628, 858, 417, 3986, 247, 13969, 581, 273, 253, 30628, 13008, 323, 18235, 1223, 253, 643, 1264, 30628, 403, 5777, 2762, 275, 1798, 253, 
37317, 326, 14285, 323, 18235, 5439, 247, 1180, 273, 7350, 326, 452, 644, 5469, 387, 2978, 342, 253, 4477, 665, 497, 2104, 281, 19148, 690, 273, 253, 3374, 533, 690, 273, 253, 9172, 858, 417, 10517, 253, 37317, 891, 2427, 949, 253, 2929, 285, 891, 1119, 253, 2929, 4891, 432, 247, 7681, 1127, 273, 1859, 533, 891, 3894, 690, 273, 253, 30628, 7350, 285, 891, 1158, 326, 253, 4477, 943, 1805, 1899, 616, 7680, 342, 1675, 281, 253, 1375, 273, 253, 1445, 50276, 1189, 455, 436, 2929, 310, 45210, 285, 891, 1928, 352, 3198, 1335, 690, 789, 281, 17337, 2590, 14924, 534, 891, 1158, 588, 320, 3517 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors introduce a deeplearningbased extension to accelerated failure time modeling for survival analysis and introduce a rankregressiontype loss function based on gehans rank statistic this is a somewhat confusing paper for several reasons i the problem is set as aft however the likelihood encoded in the residuals which is key to aft is not used for optimization so one could argue that the proposed approach is not an aft formulation but rather rank regression with censoring ii the rankbased loss function in 6 that is very similar to that in raykar et al neurips 2007 on ranking in survival analysis bounds on the concordance index which is used in draft but not discussed in the paper iii the experiments show that the proposed model is no better than the standard linear cox proportional hazard in terms of cindex which is the most appropriate performance metric considering the proposed model is optimizing for ordering using a rank statistic iv the theoretical results for 7 in the linear case as the authors discuss are not directly generalizable to nonlinear models the proposed approach seems to outperform cox in large datasets the authors may consider presenting results on additional large datasets to confirm that it is indeed the case which in turn will make the experiments section stronger note that draft in figure 1 is not introduced or discussed in the paper the proposed approach presented as a deeplearning extension of aft optimizes a rankbased loss function very similar to that in raykar et al neurips 2007 on ranking in survival analysis bounds on the concordance index the experiments results show that the proposed approach does not outperform standard cox in 3 out of 4 datasets docsepthe authors suggest a neural network based accelerated failure time aft model as well as an l1type rank loss which they argue results in an easy and fast to train model which is state of the art in terms of two standard evaluation metrics concordance and integrated brier score following the semiparametric aft literature the baseline hazard is estimated from the data and the full hazard function is parametrized in a way that the parametric part corresponds to a prediction of the failure time the main innovation of the paper is the use of a novel loss function based on a ranking loss using gehans rank statistic which while being used in the statistical literature has not been used in a machine learning setting strengths use of a for the machine learning literature novel loss function based on gehans rank statistic the paper is mostly clearly written and easy to follow the authors put their method into context well weaknesses due to semiparametric nature prediction only possible up until the last observed patient time the loss function in jin et al 2003 is derived for the linear case while the authors argue for how it could work in the nonlinear case there is no guarantee that the optimal set of parameters corresponds to a zero rank statistic in the nonlinear case the loss function is quadratic in the training samples which could lead to bad performance for large datasets the authors however show very fast performance on a large dataset which makes this a minor point the authors introduce both a neural network architecture for aft models and an l1 ranking loss it is not entirely clear how much influence either of these have on the benchmark results in the appendix the authors introduce other 
loss functions but do not include the trained models based on those losses in the benchmark results the addition of these in table 2 and 3 would help disentangle the influence of the model architecture and of the loss which would be interesting other the loss function is quadratic in the number of patientsintervals whereas the full likelihood is linear given this the speed on kkbox seems very surprising and is probably due to a small constant it would be interesting to see the scaling with the number of patients n with real data with both the ranking and the maximum likelihood loss the authors could make the connection between epsilon in equation 3 and h0 in 9 more clear as a step before using the nelsonaalen estimator to use a semiparametric aft the authors use calibrated when talking about brier score however traditionally when talking especially about aft models calibration refers to the difference between real and predicted event times and the brier score is a mix of calibration and classification the brier score is the calibration of the survival distribution over time so calling it calibration is not incorrect however within the context of the survival literature somewhat misleading more specific language would make the paper more readable an introduction or motivation why the loss function is a good loss function would be beneficial for the understanding of the paper the authors could elaborate on rank statistics and why they are expected to be more stable especially as eg deephit talks about ranking when talking about concordance it would be helpful for reading that ranking here is in reference to a weighted logrank estimating function it would be interesting to see how well the model does in calibration of the event times also with the other loss functions as this is clinically meaningful the bold facing in table 2 is not consistent deephit should be bold in one case in the appendix the authors make it seem that there are no convergence guarantees for mle based estimators however also considering the application to aft models as a special case there do exist estimators that have some theoretical convergence guarantees tang et al 2020 survival analysis via ordinary differential equations a small discussion thereof could be beneficial while there is a large number of machine learning methods for survival by considering a for the machine learning literature novel loss function in the aft setting this paper can be seen as a meaningful contribution to the machine learning for survival literature the theoretical foundation of the loss function is however not perfect and the influence of loss function and architecture are hard to disentangle due to the form of the loss function a formal scaling analysis would also be needed i would recommend a weak accept i could be convinced to improve of the score with a more convincing argument why the loss term in the nonlinear setting is a good loss term maybe through some upper bound on the loss term docsepthis paper proposes to combine the idea of gehans rank statistic idea on fitting the aft model as well as using the deep learning model as a nonlinear method for replacing the linear method in the original aft model the authors argue they connect gehans nonparametric technique with deep learning models experiments on various benchmark datasets show that the proposed model is competitive to stateoftheart baselines strength 1 the idea of combining gehans model with nonlinear deep learning model is new 2 performance is stable on several benchmarks 3 
clear writing weaknesses 1 the intuition of the approach is not quite clear does the proposed model focus on reducing computational time or improving the performance or both 2 the results do not support the conclusion the improvements seem not significant compared to baselines 3 the methodology parts mostly combine another existing model idea into the current aft model hence the innovation is limited methodologicalwise while the idea of combining the gehan model with deep learning is new the results show that the proposed model does not outperform sota systems on several benchmarks docsepthis paper proposes a deep neural net extension to an existing rankingbased estimator used for the semiparametric aft model jin et al 2003 the resulting method achieves timedependent concordance index and integrated brier score values that are competitive compared to baseline deep aft models i found this paper straightforward to read although i have several large concerns my first concern is that the exposition is overly complicated even though the key idea of the paper is extremely simple the proposed model dart can be derived by just taking the known rank estimator by jin et al 2003 and swapping out the inner product in their equation 21 with a neural net and then just training using minibatch gradient descent instead of linear programming in other words much like how faraggi and simon 1995 swapped out the inner product in the semiparametric cox model to be a neural net to come up with the same neural net model as deepsurv katzman et al 2018 this paper does the same thing with the already existing semiparametric aft model which already was known before the jin et al 2003 paper but the optimization procedures in estimating the regression coefficients were not optimal from a technical noveltyinnovation perspective there is very little that is conceptually new instead the paper is motivated in a way that spends way too much emphasis and text on describing standard results from survival analysis eg what is a cox model what is an aft model why one would use standard cox ph vs aft that the cox model can be made timedependent that aft models can be specified in both parametric or semiparametric forms etc id suggest perhaps having your background instead focus on existing semiparametric aft literature that youre simply doing a straightforward neural net extension replace inner product with neural net and comparing your model with draft and date i found figure 1 very helpful id suggest reducing the amount of text spent on explaining what hazards models are or why one should use hazards models over aft models as this is a very old debate at this point from what i can tell it really just depends on the data of course a weibull regression model is both a hazards model and an aft model my next concern is that the experimental results are inconsistent with existing literature as is, the experimental results presented make it seem that dart is not a clear winner over topperforming baselines however once we consider that some of the numbers are quite off from literature how good dart is seems even more suspect for example your reported deephit c^td number for the kkbox dataset is dramatically off from kvamme et al 2019 basically according to what they get deephit gets 0.888 which is higher than dart 0.867 some of your ibs numbers across methods also seem off some explanation of these discrepancies would be helpful there is another exposition issue early on in the paper the text makes it seem like the cox model does not make a linear
assumption whereas aft does this isnt true cox assumes the partial log likelihood is linear aft assumes the log survival time is linear in other words both make linearity assumptions just for different quantities references david faraggi and richard simon a neural network model for survival data statistics in medicine 1995 zhezhen jin d y lin l j wei zhiliang ying rankbased inference for the accelerated failure time model biometrika 2003 jared l katzman uri shaham alexander cloninger jonathan bates tingting jiang and yuval kluger deepsurv personalized treatment recommender system using a cox proportional hazards deep neural network bmc medical research methodology 2018 håvard kvamme ørnulf borgan ida scheel timetoevent prediction with neural networks and cox regression jmlr 2019 this paper has extremely limited technical novelty spends too much text on explaining existing standard results and definitions from survival analysis and has experimental results that are suspect inconsistent with existing literature docsepthe paper extends the previously proposed linear semiparametric aft model based on gehans rank statistic to a nonlinear setup the nonlinear model termed deep aft rankregression for timetoevent prediction model dart is parameterized by a neural network experimental results across four datasets show a competitive advantage over baselines per concordance index cindex integrated brier score ibs and training time the paper is easy to follow with competitive quantitative results however the technical contributions are lacking there are some misleading statements and the writing needs improvement below are specific examples introduction the accelerated failure time model aft or accelerated life model relates the logarithm of the failure time linearly to the features this statement is not necessarily true for parametric aft other than lognormal eg weibull exponential etc the l_coxph (eq 2) definition is not correctly specified note the risk set r_i should include censored and noncensored times t_j st t_j >= t_i sec 3.2 however estimating survival quantities eg conditional hazard function cannot be directly done for aftbased models this statement is not necessarily true for some parametric aft models eg weibull exponential etc sec 3.2 while the paper claims to make timetoevent predictions hat t = exp(g(x_i, theta)) it is not clear why those predictions cannot be used to directly estimate hat s(t | x) instead the paper proposes a semiparametric conditional hazard transformation similar to coxph without providing any justification sec 4 is there evidence that supports this statement note that the standard concordance index yields identical results with c^td for aftbased models sec 5.4 for fair comparisons the paper should compare aft and hazard models using a similar metric either cindex or c^td or both for a comprehensive evaluation the paper should provide additional qualitative results eg model predictions against ground truth and calibration curves minor issues missing x_i sim f_{x_i} in third term of eq 4 introduction typo "atf" should be "aft" sec 4 "wellfitted model yields ibs lower than" sentence is incomplete apart from empirical competitive quantitative results against baselines the technical contributions are not clear the semiparametric aft model based on gehans rank statistic objective function has been previously proposed additionally while the paper claims to make timetoevent predictions the conditional hazard transformation of model predictions seems to indicate the model predicts hazards instead ### Summary:
most reviewers came to the conclusion that this work lacks novelty and theoretical depth further severe concerns about the validity of some statements and about the experimental setup have been raised the rebuttal was not perceived as being fully convincing and nobody wanted to champion this paper i share most of these points of criticism although there is certainly some potential in this work i think it is not ready for publication and would at least need a major revision
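several of the reviews above turn on gehans rank statistic as the training objective for a neural aft model and on the observation that the resulting loss is quadratic in the number of samples because it sums over pairs of subjects. a short pytorch style sketch of one common way such a gehan type pairwise loss is written, applied to aft residuals and restricted to pairs whose first index had an observed event, is given below; the network and tensor names are illustrative assumptions and the exact form is a generic rank regression objective rather than the objective of any specific paper under review.

```python
import torch

def gehan_rank_loss(net, x, log_time, event):
    """event[i] = 1.0 if subject i's failure time was observed, 0.0 if censored."""
    resid = log_time - net(x).squeeze(-1)            # aft residuals e_i = log t_i - f(x_i)
    diff = resid.unsqueeze(0) - resid.unsqueeze(1)   # diff[i, j] = e_j - e_i
    pairwise = torch.clamp(diff, min=0.0)            # hinge on each ordered pair of residuals
    return (event.unsqueeze(1) * pairwise).mean()    # only observed events weight the outer sum
```

written this way the loss materializes an n x n matrix of residual differences, which is exactly the quadratic cost the reviewers worry about for large cohorts; in practice one would compute it per minibatch or subsample pairs.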
[ 4872, 1677, 436, 253, 3885, 327, 465, 76, 3364, 3133, 1077, 10084, 285, 310, 3164, 1955, 281, 247, 1355, 3638, 352, 651, 320, 4722, 281, 923, 253, 13642, 342, 253, 1180, 273, 1363, 295, 342, 1524, 941, 342, 1097, 253, 19947, 285, 253, 4869, 12177, 2957, 50275, 783, 4477, 812, 1056, 253, 4602, 875, 299, 4277, 275, 5150, 495, 285, 288, 17, 275, 898, 625, 2590, 347, 247, 3213, 1078, 970, 253, 295, 1241, 8440, 43322, 29107, 281, 897, 247, 3300, 532, 29982, 6853, 247, 649, 50275, 783, 4477, 897, 35890, 672, 5015, 670, 270, 4586, 4868, 2299, 21533, 672, 5015, 3340, 670, 247, 649, 3210, 18543, 10770, 281, 253, 3064, 875, 1524, 285, 8131, 2362, 2069, 285, 253, 270, 4586, 4868, 310, 247, 5878, 273, 18543, 285, 9162, 253, 270, 4586, 4868, 310, 253, 18543, 273, 253, 5788, 3268, 689, 673, 594, 6789, 352, 18543, 310, 417, 13583, 2299, 1561, 253, 3634, 273, 253, 5788, 6239, 8489, 24363, 625, 2173, 3448, 651, 1056, 253, 2929, 625, 34025, 50275, 266, 10199, 390, 16038, 2139, 253, 2957, 1159, 310, 247, 1175, 2957, 1159, 651, 320, 12912, 323, 253, 4685, 273, 253, 2929, 253, 4477, 812, 21184, 327, 5958, 9990, 285, 2139, 597, 403, 3264, 281, 320, 625, 6474, 3340, 347, 24088, 372, 70, 545, 262, 12088, 670, 19947, 672, 5015, 670, 34860, 593, 352, 651, 320, 9371, 323, 4361, 326, 19947, 1060, 310, 275, 3806, 281, 247, 17375, 2412, 14714, 26230, 1159, 50275, 262, 651, 320, 4722, 281, 923, 849, 973, 253, 1566, 1057, 275, 18543, 273, 253, 2362, 2069, 671, 342, 253, 643, 2957, 3470, 347, 436, 310, 16747, 14282, 50275, 783, 13433, 10268, 275, 2829, 374, 310, 417, 5185, 372, 70, 545, 262, 943, 320, 13433, 275, 581, 1083, 50275, 249, 253, 30762, 253, 4477, 1056, 352, 1646, 326, 627, 403, 642, 14940, 23632, 323, 278, 282, 1754, 48489, 2299, 671, 7296, 253, 2898, 281, 247, 649, 3210, 347, 247, 2714, 1083, 627, 513, 2226, 48489, 326, 452, 690, 10527, 14940, 23632, 12717, 1162, 355, 9169, 5788, 1783, 3066, 9826, 8967, 7424, 247, 1355, 5955, 10445, 812, 320, 12912, 1223, 627, 310, 247, 1781, 1180, 273, 5145, 4715, 3082, 323, 5788, 407, 7296, 247, 323, 253, 5145, 4715, 6239, 4460, 2957, 1159, 275, 253, 247, 649, 4758, 436, 2929, 476, 320, 2326, 347, 247, 14282, 7680, 281, 253, 5145, 4715, 323, 5788, 6239, 253, 10527, 12153, 273, 253, 2957, 1159, 310, 2299, 417, 3962, 285, 253, 4833, 273, 2957, 1159, 285, 10336, 403, 1892, 281, 557, 290, 2134, 1955, 281, 253, 830, 273, 253, 2957, 1159, 247, 7473, 13642, 1783, 651, 671, 320, 3058, 891, 651, 5583, 247, 5075, 2997, 891, 812, 320, 13762, 281, 3157, 273, 253, 4868, 342, 247, 625, 21414, 4154, 2139, 253, 2957, 1307, 275, 253, 14561, 4758, 310, 247, 1175, 2957, 1307, 5046, 949, 690, 5170, 3033, 327, 253, 2957, 1307, 5474, 33032, 2520, 2929, 29328, 281, 13398, 253, 2934, 273, 3471, 73, 507, 5958, 26312, 2934, 327, 13532, 253, 247, 649, 1566, 347, 973, 347, 970, 253, 3676, 4715, 1566, 347, 247, 14561, 1332, 323, 15706, 253, 4872, 1332, 275, 253, 3236, 247, 649, 1566, 253, 4477, 9059, 597, 4684, 3471, 73, 507, 1327, 36928, 5853, 342, 3676, 4715, 3210, 4679, 327, 2710, 22791, 15302, 921, 326, 253, 4081, 1566, 310, 12085, 281, 1375, 23037, 14387, 1666, 25379, 4757, 50276, 18, 253, 2934, 273, 16248, 3471, 73, 507, 1566, 342, 14561, 3676, 4715, 1566, 310, 747, 374, 3045, 310, 6474, 327, 2067, 49602, 495, 2590, 4028, 50276, 20881, 1255, 265, 337, 253, 30328, 273, 253, 2746, 310, 417, 3240, 2590, 1057, 253, 4081, 1566, 2770, 327, 8493, 15180, 673, 390, 11138, 253, 3045, 390, 1097, 374, 253, 1543, 513, 417, 1329, 253, 6452, 253, 11701, 1646, 417, 1534, 2429, 281, 1666, 25379, 495, 253, 
16182, 4243, 6571, 13398, 1529, 5368, 1566, 2934, 715, 253, 1655, 247, 649, 1566, 7613, 253, 15832, 310, 3710, 35961, 3020, 1223, 253, 2934, 273, 16248, 3471, 5582, 1566, 342, 3676, 4715, 310, 747, 253, 1543, 921, 326, 253, 4081, 1566, 1057, 417, 562, 32231, 256, 5503, 2718, 327, 2067, 49602, 5474, 33032, 2520, 2929, 29328, 247, 3676, 11454, 2036, 6880, 281, 271, 5368, 19947, 3169, 29107, 908, 323, 253, 3300, 532, 29982, 6853, 247, 649, 1566, 480, 249, 1162, 355, 6469, 253, 4795, 1332, 33526, 37282, 2662, 34860, 593, 3605, 285, 8527, 270, 4586, 4868, 2193, 326, 403, 12085, 2429, 281, 8245, 3676, 247, 649, 3210, 891, 1119, 436, 2929, 15246, 281, 1239, 3738, 891, 452, 2067, 1781, 7350, 50276, 2577, 806, 4468, 310, 326, 253, 47284, 310, 27662, 9542, 1014, 2167, 253, 2234, 2934, 273, 253, 2929, 310, 6685, 2969, 253, 4081, 1566, 35197, 476, 320, 6012, 407, 816, 3192, 253, 1929, 5958, 29107, 407, 480, 249, 1162, 355, 6469, 285, 1863, 5436, 562, 253, 6703, 1885, 275, 616, 5150, 3127, 342, 247, 11454, 2036, 285, 840, 816, 6194, 970, 1054, 487, 1506, 11786, 18499, 3185, 273, 4872, 10717, 275, 643, 3000, 1199, 751, 849, 2080, 356, 7311, 285, 948, 251, 8878, 1863, 6965, 562, 253, 6703, 1885, 275, 253, 3300, 532, 29982, 6853, 820, 89, 1566, 281, 320, 247, 11454, 2036, 281, 1705, 598, 342, 253, 1072, 11454, 2036, 1566, 347, 372, 2265, 321, 87, 465, 16859, 1342, 1162, 355, 4765, 436, 2929, 1057, 253, 1072, 2181, 342, 253, 2168, 5368, 3300, 532, 29982, 6853, 247, 649, 1566, 534, 2168, 369, 1929, 1078, 253, 480, 249, 1162, 355, 6469, 2929, 533, 253, 13757, 7259, 275, 26230, 253, 9077, 10303, 497, 417, 8654, 432, 247, 7681, 38135, 2966, 21373, 8668, 627, 310, 1077, 1652, 326, 310, 4473, 1230, 747, 3185, 253, 2929, 310, 17194, 275, 247, 1039, 326, 30885, 1039, 1512, 1199, 10251, 382, 2068, 327, 12930, 2629, 1543, 432, 5788, 1783, 24088, 752, 310, 247, 820, 89, 1566, 752, 310, 271, 247, 649, 1566, 2139, 581, 651, 897, 2629, 820, 89, 815, 4632, 247, 649, 326, 253, 820, 89, 1566, 476, 320, 1160, 37282, 2662, 326, 247, 649, 3210, 476, 320, 7616, 275, 1097, 36833, 390, 3300, 532, 29982, 6853, 4948, 3966, 2654, 1804, 4931, 1907, 634, 4114, 3185, 2770, 327, 5368, 3300, 532, 29982, 6853, 247, 649, 6239, 326, 368, 250, 3365, 2509, 247, 15246, 11454, 2036, 6880, 8171, 6703, 1885, 342, 11454, 2036, 285, 10941, 634, 1566, 342, 7482, 285, 3522, 891, 1119, 4677, 337, 1077, 9371, 2654, 1804, 8493, 253, 2408, 273, 2505, 5262, 327, 15571, 752, 29952, 3210, 403, 390, 2139, 581, 943, 897, 29952, 3210, 689, 247, 649, 3210, 347, 436, 310, 247, 1077, 1711, 8881, 387, 436, 1127, 432, 752, 891, 476, 2028, 352, 1663, 816, 7024, 327, 253, 941, 273, 2282, 247, 359, 487, 962, 9077, 1566, 310, 1097, 247, 29952, 1566, 285, 271, 247, 649, 1566, 50276, 2577, 1735, 4468, 310, 326, 253, 5661, 1543, 403, 16706, 342, 5368, 6239, 347, 310, 253, 5661, 1543, 3559, 1056, 352, 1646, 326, 35197, 310, 417, 247, 2590, 13688, 689, 281, 3803, 14692, 1666, 25379, 2299, 2378, 359, 1908, 326, 690, 273, 253, 3904, 403, 3240, 745, 432, 6239, 849, 1175, 35197, 310, 3133, 1014, 625, 9101, 323, 1650, 634, 2361, 372, 70, 545, 262, 260, 1156, 2851, 1180, 323, 253, 465, 76, 3364, 10895, 310, 16821, 745, 432, 44739, 312, 1405, 1162, 355, 6247, 10352, 1037, 2556, 281, 752, 597, 755, 372, 70, 545, 262, 4850, 470, 25452, 534, 310, 2169, 685, 35197, 16331, 2251, 690, 273, 634, 18890, 84, 3904, 2439, 3082, 671, 1646, 745, 690, 8813, 273, 841, 37122, 651, 320, 9371, 50276, 9088, 310, 1529, 47284, 2523, 2393, 327, 275, 253, 2929, 253, 2505, 2789, 352, 1646, 751, 253, 
820, 89, 1566, 1057, 417, 1056, 247, 4872, 9376, 5727, 247, 649, 1057, 436, 310, 2649, 2032, 820, 89, 19584, 253, 7898, 2412, 12177, 310, 4872, 247, 649, 19584, 253, 2412, 5788, 673, 310, 4872, 275, 643, 3000, 1097, 1056, 50137, 13260, 816, 323, 1027, 13483, 50276, 250, 3065, 50276, 34926, 301, 2080, 356, 7311, 285, 6793, 472, 948, 251, 247, 11454, 2990, 1566, 323, 5788, 941, 9990, 275, 9921, 8878, 50276, 91, 248, 91, 864, 480, 249, 277, 340, 19169, 298, 480, 359, 74, 1182, 73, 3093, 606, 340, 272, 5958, 3169, 17032, 323, 253, 21702, 4433, 673, 1566, 1794, 2755, 32410, 6469, 50276, 75, 1096, 298, 465, 16859, 1342, 41475, 439, 1240, 312, 247, 1591, 5945, 49556, 4940, 480, 251, 10511, 270, 684, 39838, 1076, 480, 22589, 285, 340, 86, 1208, 27451, 814, 254, 372, 2265, 321, 87, 32339, 1971, 3818, 3109, 985, 970, 247, 820, 89, 14495, 29952, 3676, 11454, 2990, 270, 17475, 3739, 2561, 16182, 4765, 50276, 73, 12299, 44739, 312, 1405, 391, 79, 5773, 270, 7397, 2654, 66, 4436, 293, 4522, 16713, 8045, 10554, 342, 11454, 6928, 285, 820, 89, 9077, 480, 1686, 83, 6247, 436, 2929, 556, 6685, 3710, 7681, 38135, 30885, 1512, 1199, 2505, 327, 15571, 5368, 2629, 1543, 1545, 24164, 432, 5788, 1783, 285, 556, 5661, 1543, 326, 403, 9101, 16706, 342, 5368, 6239, 5474, 339, 431, 248, 2929, 8725, 253, 3786, 4081, 4872, 3300, 532, 29982, 6853, 247, 649, 1566, 1754, 327, 3471, 73, 507, 5958, 26312, 281, 247, 14561, 9978, 253, 14561, 1566, 23776, 3676, 247, 649, 5958, 1747, 1256, 323, 4522, 16713, 8045, 10554, 1566, 35197, 310, 4764, 1025, 407, 247, 11454, 2990, 5661, 1543, 2439, 1740, 15302, 921, 247, 12085, 5750, 689, 1666, 25379, 591, 34860, 593, 3605, 260, 4663, 8527, 270, 4586, 4868, 18890, 84, 285, 3733, 673, 253, 2929, 310, 3477, 281, 956, 342, 12085, 11745, 1543, 2299, 253, 7681, 9021, 403, 14999, 627, 403, 690, 24363, 7234, 285, 253, 4028, 3198, 7756, 2708, 403, 2173, 6667, 50274, 46089, 253, 21702, 4433, 673, 1566, 247, 649, 390, 21702, 1495, 1566, 7033, 253, 42407, 273, 253, 4433, 673, 23352, 281, 253, 3386, 436, 3908, 310, 417, 7933, 2032, 323, 36833, 247, 649, 643, 685, 298, 2331, 1939, 24088, 50276, 664, 487, 962, 17619, 50276, 14069, 50275, 783, 298, 1109, 820, 89, 545, 50276, 2574, 374, 5426, 310, 417, 9113, 7616, 3877, 253, 2495, 873, 14168, 1179, 363, 943, 2486, 23339, 2149, 327, 1327, 46874, 2149, 2069, 246, 75, 331, 246, 75, 50276, 6811, 50276, 1704, 4567, 2299, 26230, 5788, 13483, 24088, 17697, 16466, 1159, 2550, 320, 3587, 2218, 323, 247, 649, 3169, 3210, 436, 3908, 310, 417, 7933, 2032, 323, 690, 36833, 247, 649, 3210, 24088, 359, 487, 962, 17619, 50276, 14069, 50276, 1704, 4567, 1223, 2929, 3916, 281, 1056, 4522, 16713, 8045, 50276, 12787, 11297, 288, 1595, 50276, 4347, 72, 2981, 39116, 352, 310, 417, 2590, 2139, 1110, 13650, 476, 417, 320, 908, 281, 3587, 6642, 7856, 296, 1269, 3185, 253, 2929, 29328, 247, 3300, 532, 29982, 6853, 17697, 16466, 9261, 2074, 281, 820, 89, 545, 1293, 5277, 667, 22861, 50276, 1704, 577, 310, 627, 1941, 326, 8525, 436, 3908, 50276, 9939, 326, 2629, 34860, 593, 3605, 11026, 8931, 1543, 342, 40373, 260, 1109, 32989, 323, 247, 649, 3169, 3210, 50274, 1704, 8255, 50276, 1542, 4344, 14023, 253, 2929, 943, 7277, 247, 649, 285, 16466, 3210, 970, 247, 2074, 7982, 2057, 260, 4663, 390, 40373, 260, 2851, 390, 1097, 50276, 1542, 247, 11088, 7103, 253, 2929, 943, 2085, 3081, 18276, 1543, 24088, 1566, 13650, 1411, 3216, 5083, 285, 18543, 9191, 50276, 37585, 3374, 50275, 33722, 1269, 74, 948, 269, 2981, 275, 2626, 1307, 16186, 577, 50276, 46089, 1745, 80, 387, 71, 943, 320, 247, 
649, 50276, 1704, 577, 973, 71, 2166, 1566, 11026, 18890, 84, 2406, 685, 50276, 36817, 310, 18464, 50276, 522, 435, 432, 16774, 12085, 11745, 1543, 1411, 1666, 25379, 253, 7681, 9021, 403, 417, 2590, 253, 3300, 532, 29982, 6853, 247, 649, 1566, 1754, 327, 3471, 73, 507, 5958, 26312, 8103, 1159, 556, 644, 3786, 4081, 23000, 1223, 253, 2929, 3916, 281, 1056, 4522, 16713, 8045, 13650, 253, 17697, 16466, 9261, 273, 1566, 13650, 3133, 281, 5224, 253, 1566, 26295, 29952, 3185, 2490, 187, 4118, 18435, 27, 2252, 30628, 2210, 281, 253, 6452, 326, 436, 789, 19756, 38135, 285, 10527, 6864, 2007, 5460, 7350, 670, 253, 13091, 273, 690, 7234, 285, 670, 253, 5661, 9978, 452, 644, 5439, 253, 30080, 22559, 369, 417, 12351, 347, 1146, 4751, 21414, 285, 12445, 3078, 281, 16928, 436, 2929, 50276, 74, 3894, 954, 273, 841, 2792, 273, 14226, 3738, 627, 310, 5604, 690, 2442, 275, 436, 789, 891, 1158, 352, 310, 417, 4704, 323, 9311, 285, 651, 387, 1878, 878, 247, 2201, 18520 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 4872, 1677, 436, 253, 3885, 327, 465, 76, 3364, 3133, 1077, 10084, 285, 310, 3164, 1955, 281, 247, 1355, 3638, 352, 651, 320, 4722, 281, 923, 253, 13642, 342, 253, 1180, 273, 1363, 295, 342, 1524, 941, 342, 1097, 253, 19947, 285, 253, 4869, 12177, 2957, 50275, 783, 4477, 812, 1056, 253, 4602, 875, 299, 4277, 275, 5150, 495, 285, 288, 17, 275, 898, 625, 2590, 347, 247, 3213, 1078, 970, 253, 295, 1241, 8440, 43322, 29107, 281, 897, 247, 3300, 532, 29982, 6853, 247, 649, 50275, 783, 4477, 897, 35890, 672, 5015, 670, 270, 4586, 4868, 2299, 21533, 672, 5015, 3340, 670, 247, 649, 3210, 18543, 10770, 281, 253, 3064, 875, 1524, 285, 8131, 2362, 2069, 285, 253, 270, 4586, 4868, 310, 247, 5878, 273, 18543, 285, 9162, 253, 270, 4586, 4868, 310, 253, 18543, 273, 253, 5788, 3268, 689, 673, 594, 6789, 352, 18543, 310, 417, 13583, 2299, 1561, 253, 3634, 273, 253, 5788, 6239, 8489, 24363, 625, 2173, 3448, 651, 1056, 253, 2929, 625, 34025, 50275, 266, 10199, 390, 16038, 2139, 253, 2957, 1159, 310, 247, 1175, 2957, 1159, 651, 320, 12912, 323, 253, 4685, 273, 253, 2929, 253, 4477, 812, 21184, 327, 5958, 9990, 285, 2139, 597, 403, 3264, 281, 320, 625, 6474, 3340, 347, 24088, 372, 70, 545, 262, 12088, 670, 19947, 672, 5015, 670, 34860, 593, 352, 651, 320, 9371, 323, 4361, 326, 19947, 1060, 310, 275, 3806, 281, 247, 17375, 2412, 14714, 26230, 1159, 50275, 262, 651, 320, 4722, 281, 923, 849, 973, 253, 1566, 1057, 275, 18543, 273, 253, 2362, 2069, 671, 342, 253, 643, 2957, 3470, 347, 436, 310, 16747, 14282, 50275, 783, 13433, 10268, 275, 2829, 374, 310, 417, 5185, 372, 70, 545, 262, 943, 320, 13433, 275, 581, 1083, 50275, 249, 253, 30762, 253, 4477, 1056, 352, 1646, 326, 627, 403, 642, 14940, 23632, 323, 278, 282, 1754, 48489, 2299, 671, 7296, 253, 2898, 281, 247, 649, 3210, 347, 247, 2714, 1083, 627, 513, 2226, 48489, 326, 452, 690, 10527, 14940, 23632, 12717, 1162, 355, 9169, 5788, 1783, 3066, 9826, 8967, 7424, 247, 1355, 5955, 10445, 812, 320, 12912, 1223, 627, 310, 247, 1781, 1180, 273, 5145, 4715, 3082, 323, 5788, 407, 7296, 247, 323, 253, 5145, 4715, 6239, 4460, 2957, 1159, 275, 253, 247, 649, 4758, 436, 2929, 476, 320, 2326, 347, 247, 14282, 7680, 281, 253, 5145, 4715, 323, 5788, 6239, 253, 10527, 12153, 273, 253, 2957, 1159, 310, 2299, 417, 3962, 285, 253, 4833, 273, 2957, 1159, 285, 10336, 403, 1892, 281, 557, 290, 2134, 1955, 281, 253, 830, 273, 253, 2957, 1159, 247, 7473, 13642, 1783, 651, 671, 320, 3058, 891, 651, 5583, 247, 5075, 2997, 891, 812, 320, 13762, 281, 3157, 273, 253, 4868, 342, 247, 625, 21414, 4154, 2139, 253, 2957, 1307, 275, 253, 14561, 4758, 310, 247, 1175, 2957, 1307, 5046, 949, 690, 5170, 3033, 327, 253, 2957, 1307, 5474, 33032, 2520, 2929, 29328, 281, 13398, 253, 2934, 273, 3471, 73, 507, 5958, 26312, 2934, 327, 13532, 253, 247, 649, 1566, 347, 973, 347, 970, 253, 3676, 4715, 1566, 347, 247, 14561, 1332, 323, 15706, 253, 4872, 1332, 275, 253, 3236, 247, 649, 1566, 253, 4477, 9059, 597, 4684, 3471, 73, 507, 1327, 36928, 5853, 342, 3676, 4715, 3210, 4679, 327, 2710, 22791, 15302, 921, 326, 253, 4081, 1566, 310, 12085, 281, 1375, 23037, 14387, 1666, 25379, 4757, 50276, 18, 253, 2934, 273, 16248, 3471, 73, 507, 1566, 342, 14561, 3676, 4715, 1566, 310, 747, 374, 3045, 310, 6474, 327, 2067, 49602, 495, 2590, 4028, 50276, 20881, 1255, 265, 337, 253, 30328, 273, 253, 2746, 310, 417, 3240, 2590, 1057, 253, 4081, 1566, 2770, 327, 8493, 15180, 673, 390, 11138, 253, 3045, 390, 1097, 374, 253, 1543, 513, 417, 1329, 253, 6452, 253, 11701, 1646, 417, 1534, 2429, 281, 1666, 25379, 495, 253, 
16182, 4243, 6571, 13398, 1529, 5368, 1566, 2934, 715, 253, 1655, 247, 649, 1566, 7613, 253, 15832, 310, 3710, 35961, 3020, 1223, 253, 2934, 273, 16248, 3471, 5582, 1566, 342, 3676, 4715, 310, 747, 253, 1543, 921, 326, 253, 4081, 1566, 1057, 417, 562, 32231, 256, 5503, 2718, 327, 2067, 49602, 5474, 33032, 2520, 2929, 29328, 247, 3676, 11454, 2036, 6880, 281, 271, 5368, 19947, 3169, 29107, 908, 323, 253, 3300, 532, 29982, 6853, 247, 649, 1566, 480, 249, 1162, 355, 6469, 253, 4795, 1332, 33526, 37282, 2662, 34860, 593, 3605, 285, 8527, 270, 4586, 4868, 2193, 326, 403, 12085, 2429, 281, 8245, 3676, 247, 649, 3210, 891, 1119, 436, 2929, 15246, 281, 1239, 3738, 891, 452, 2067, 1781, 7350, 50276, 2577, 806, 4468, 310, 326, 253, 47284, 310, 27662, 9542, 1014, 2167, 253, 2234, 2934, 273, 253, 2929, 310, 6685, 2969, 253, 4081, 1566, 35197, 476, 320, 6012, 407, 816, 3192, 253, 1929, 5958, 29107, 407, 480, 249, 1162, 355, 6469, 285, 1863, 5436, 562, 253, 6703, 1885, 275, 616, 5150, 3127, 342, 247, 11454, 2036, 285, 840, 816, 6194, 970, 1054, 487, 1506, 11786, 18499, 3185, 273, 4872, 10717, 275, 643, 3000, 1199, 751, 849, 2080, 356, 7311, 285, 948, 251, 8878, 1863, 6965, 562, 253, 6703, 1885, 275, 253, 3300, 532, 29982, 6853, 820, 89, 1566, 281, 320, 247, 11454, 2036, 281, 1705, 598, 342, 253, 1072, 11454, 2036, 1566, 347, 372, 2265, 321, 87, 465, 16859, 1342, 1162, 355, 4765, 436, 2929, 1057, 253, 1072, 2181, 342, 253, 2168, 5368, 3300, 532, 29982, 6853, 247, 649, 1566, 534, 2168, 369, 1929, 1078, 253, 480, 249, 1162, 355, 6469, 2929, 533, 253, 13757, 7259, 275, 26230, 253, 9077, 10303, 497, 417, 8654, 432, 247, 7681, 38135, 2966, 21373, 8668, 627, 310, 1077, 1652, 326, 310, 4473, 1230, 747, 3185, 253, 2929, 310, 17194, 275, 247, 1039, 326, 30885, 1039, 1512, 1199, 10251, 382, 2068, 327, 12930, 2629, 1543, 432, 5788, 1783, 24088, 752, 310, 247, 820, 89, 1566, 752, 310, 271, 247, 649, 1566, 2139, 581, 651, 897, 2629, 820, 89, 815, 4632, 247, 649, 326, 253, 820, 89, 1566, 476, 320, 1160, 37282, 2662, 326, 247, 649, 3210, 476, 320, 7616, 275, 1097, 36833, 390, 3300, 532, 29982, 6853, 4948, 3966, 2654, 1804, 4931, 1907, 634, 4114, 3185, 2770, 327, 5368, 3300, 532, 29982, 6853, 247, 649, 6239, 326, 368, 250, 3365, 2509, 247, 15246, 11454, 2036, 6880, 8171, 6703, 1885, 342, 11454, 2036, 285, 10941, 634, 1566, 342, 7482, 285, 3522, 891, 1119, 4677, 337, 1077, 9371, 2654, 1804, 8493, 253, 2408, 273, 2505, 5262, 327, 15571, 752, 29952, 3210, 403, 390, 2139, 581, 943, 897, 29952, 3210, 689, 247, 649, 3210, 347, 436, 310, 247, 1077, 1711, 8881, 387, 436, 1127, 432, 752, 891, 476, 2028, 352, 1663, 816, 7024, 327, 253, 941, 273, 2282, 247, 359, 487, 962, 9077, 1566, 310, 1097, 247, 29952, 1566, 285, 271, 247, 649, 1566, 50276, 2577, 1735, 4468, 310, 326, 253, 5661, 1543, 403, 16706, 342, 5368, 6239, 347, 310, 253, 5661, 1543, 3559, 1056, 352, 1646, 326, 35197, 310, 417, 247, 2590, 13688, 689, 281, 3803, 14692, 1666, 25379, 2299, 2378, 359, 1908, 326, 690, 273, 253, 3904, 403, 3240, 745, 432, 6239, 849, 1175, 35197, 310, 3133, 1014, 625, 9101, 323, 1650, 634, 2361, 372, 70, 545, 262, 260, 1156, 2851, 1180, 323, 253, 465, 76, 3364, 10895, 310, 16821, 745, 432, 44739, 312, 1405, 1162, 355, 6247, 10352, 1037, 2556, 281, 752, 597, 755, 372, 70, 545, 262, 4850, 470, 25452, 534, 310, 2169, 685, 35197, 16331, 2251, 690, 273, 634, 18890, 84, 3904, 2439, 3082, 671, 1646, 745, 690, 8813, 273, 841, 37122, 651, 320, 9371, 50276, 9088, 310, 1529, 47284, 2523, 2393, 327, 275, 253, 2929, 253, 2505, 2789, 352, 1646, 751, 253, 
820, 89, 1566, 1057, 417, 1056, 247, 4872, 9376, 5727, 247, 649, 1057, 436, 310, 2649, 2032, 820, 89, 19584, 253, 7898, 2412, 12177, 310, 4872, 247, 649, 19584, 253, 2412, 5788, 673, 310, 4872, 275, 643, 3000, 1097, 1056, 50137, 13260, 816, 323, 1027, 13483, 50276, 250, 3065, 50276, 34926, 301, 2080, 356, 7311, 285, 6793, 472, 948, 251, 247, 11454, 2990, 1566, 323, 5788, 941, 9990, 275, 9921, 8878, 50276, 91, 248, 91, 864, 480, 249, 277, 340, 19169, 298, 480, 359, 74, 1182, 73, 3093, 606, 340, 272, 5958, 3169, 17032, 323, 253, 21702, 4433, 673, 1566, 1794, 2755, 32410, 6469, 50276, 75, 1096, 298, 465, 16859, 1342, 41475, 439, 1240, 312, 247, 1591, 5945, 49556, 4940, 480, 251, 10511, 270, 684, 39838, 1076, 480, 22589, 285, 340, 86, 1208, 27451, 814, 254, 372, 2265, 321, 87, 32339, 1971, 3818, 3109, 985, 970, 247, 820, 89, 14495, 29952, 3676, 11454, 2990, 270, 17475, 3739, 2561, 16182, 4765, 50276, 73, 12299, 44739, 312, 1405, 391, 79, 5773, 270, 7397, 2654, 66, 4436, 293, 4522, 16713, 8045, 10554, 342, 11454, 6928, 285, 820, 89, 9077, 480, 1686, 83, 6247, 436, 2929, 556, 6685, 3710, 7681, 38135, 30885, 1512, 1199, 2505, 327, 15571, 5368, 2629, 1543, 1545, 24164, 432, 5788, 1783, 285, 556, 5661, 1543, 326, 403, 9101, 16706, 342, 5368, 6239, 5474, 339, 431, 248, 2929, 8725, 253, 3786, 4081, 4872, 3300, 532, 29982, 6853, 247, 649, 1566, 1754, 327, 3471, 73, 507, 5958, 26312, 281, 247, 14561, 9978, 253, 14561, 1566, 23776, 3676, 247, 649, 5958, 1747, 1256, 323, 4522, 16713, 8045, 10554, 1566, 35197, 310, 4764, 1025, 407, 247, 11454, 2990, 5661, 1543, 2439, 1740, 15302, 921, 247, 12085, 5750, 689, 1666, 25379, 591, 34860, 593, 3605, 260, 4663, 8527, 270, 4586, 4868, 18890, 84, 285, 3733, 673, 253, 2929, 310, 3477, 281, 956, 342, 12085, 11745, 1543, 2299, 253, 7681, 9021, 403, 14999, 627, 403, 690, 24363, 7234, 285, 253, 4028, 3198, 7756, 2708, 403, 2173, 6667, 50274, 46089, 253, 21702, 4433, 673, 1566, 247, 649, 390, 21702, 1495, 1566, 7033, 253, 42407, 273, 253, 4433, 673, 23352, 281, 253, 3386, 436, 3908, 310, 417, 7933, 2032, 323, 36833, 247, 649, 643, 685, 298, 2331, 1939, 24088, 50276, 664, 487, 962, 17619, 50276, 14069, 50275, 783, 298, 1109, 820, 89, 545, 50276, 2574, 374, 5426, 310, 417, 9113, 7616, 3877, 253, 2495, 873, 14168, 1179, 363, 943, 2486, 23339, 2149, 327, 1327, 46874, 2149, 2069, 246, 75, 331, 246, 75, 50276, 6811, 50276, 1704, 4567, 2299, 26230, 5788, 13483, 24088, 17697, 16466, 1159, 2550, 320, 3587, 2218, 323, 247, 649, 3169, 3210, 436, 3908, 310, 417, 7933, 2032, 323, 690, 36833, 247, 649, 3210, 24088, 359, 487, 962, 17619, 50276, 14069, 50276, 1704, 4567, 1223, 2929, 3916, 281, 1056, 4522, 16713, 8045, 50276, 12787, 11297, 288, 1595, 50276, 4347, 72, 2981, 39116, 352, 310, 417, 2590, 2139, 1110, 13650, 476, 417, 320, 908, 281, 3587, 6642, 7856, 296, 1269, 3185, 253, 2929, 29328, 247, 3300, 532, 29982, 6853, 17697, 16466, 9261, 2074, 281, 820, 89, 545, 1293, 5277, 667, 22861, 50276, 1704, 577, 310, 627, 1941, 326, 8525, 436, 3908, 50276, 9939, 326, 2629, 34860, 593, 3605, 11026, 8931, 1543, 342, 40373, 260, 1109, 32989, 323, 247, 649, 3169, 3210, 50274, 1704, 8255, 50276, 1542, 4344, 14023, 253, 2929, 943, 7277, 247, 649, 285, 16466, 3210, 970, 247, 2074, 7982, 2057, 260, 4663, 390, 40373, 260, 2851, 390, 1097, 50276, 1542, 247, 11088, 7103, 253, 2929, 943, 2085, 3081, 18276, 1543, 24088, 1566, 13650, 1411, 3216, 5083, 285, 18543, 9191, 50276, 37585, 3374, 50275, 33722, 1269, 74, 948, 269, 2981, 275, 2626, 1307, 16186, 577, 50276, 46089, 1745, 80, 387, 71, 943, 320, 247, 
649, 50276, 1704, 577, 973, 71, 2166, 1566, 11026, 18890, 84, 2406, 685, 50276, 36817, 310, 18464, 50276, 522, 435, 432, 16774, 12085, 11745, 1543, 1411, 1666, 25379, 253, 7681, 9021, 403, 417, 2590, 253, 3300, 532, 29982, 6853, 247, 649, 1566, 1754, 327, 3471, 73, 507, 5958, 26312, 8103, 1159, 556, 644, 3786, 4081, 23000, 1223, 253, 2929, 3916, 281, 1056, 4522, 16713, 8045, 13650, 253, 17697, 16466, 9261, 273, 1566, 13650, 3133, 281, 5224, 253, 1566, 26295, 29952, 3185, 2490, 187, 4118, 18435, 27, 2252, 30628, 2210, 281, 253, 6452, 326, 436, 789, 19756, 38135, 285, 10527, 6864, 2007, 5460, 7350, 670, 253, 13091, 273, 690, 7234, 285, 670, 253, 5661, 9978, 452, 644, 5439, 253, 30080, 22559, 369, 417, 12351, 347, 1146, 4751, 21414, 285, 12445, 3078, 281, 16928, 436, 2929, 50276, 74, 3894, 954, 273, 841, 2792, 273, 14226, 3738, 627, 310, 5604, 690, 2442, 275, 436, 789, 891, 1158, 352, 310, 417, 4704, 323, 9311, 285, 651, 387, 1878, 878, 247, 2201, 18520 ]
Below is a given review of a research paper from a conference journal. Please write a summary of the review.
### Review: this paper studies the federated learning problem in this model local clients are allowed to jointly train a model without sharing user data and the central goal is to design communication efficient algorithms the authors give a federated averaging langevin algorithm and provide theoretical guarantees for strongly logconcave distributions these results provide guidance on the choice of learning rates and local updates to minimize communication cost the authors also consider applying correlated noises and using only partial device updates which are more applicable in practice the authors propose the federated averaging langevin algorithm fald for posterior inference the authors present nontrivial nonasymptotic convergence analysis for fald for distributions with strongly smooth and strongly convex hamiltonian and with bounded variance of the noise in the stochastic gradient the assumption of bounded gradient in l2 norm is not required in contrast to the previous work fedavg as well as many others in the literature the analysis of fald indicates the number of global steps and the choice of the learning rate more importantly it shows that the number of local steps should be set roughly as the order of the square root of the condition number the authors further obtain nontrivial results in the cases of varying learning rates in each step a privacy-accuracy tradeoff via correlated gaussian noises and a computation model with partial device participation the highlights of this paper are the novel theoretical analysis of the fald algorithm it demonstrates how the injected noise the data heterogeneity and the stochastic-gradient noise affect the convergence it would be nice if the authors could also provide some lower bound or worst case analysis for the fald algorithm to show that the current convergence guarantees are tight or close to tight for example how does fald behave for gaussian posteriors is the convergence rate in the main theorem asymptotically optimal for the gaussian case detailed comments page 3 right before section 4 $\tilde f$ is an unbiased estimate of $f$ shouldnt it be $\nabla \tilde f$ is an unbiased estimate of $\nabla f$ overall i think this is a nice paper with strong theoretical results it provides insights into the theoretical study of standard sampling algorithms in federated learning the paper is also well written

docsepin this work the authors 1 propose a federated learning version of langevin diffusion sampling 2 develop theoretical guarantees for fald for strongly logconcave distributions with non-iid data and study how the injected noise and the stochastic gradient noise the heterogeneity of data and the varying learning rates affect the convergence 3 analyze the partial participation setting strength 1 the idea of sampling using langevin diffusion in the federated learning setting is new weakness since this is a theoretical paper i am basing my decision on its theoretical contribution 1 novelty in the proof of ld is incremental only the part of the proof which is different from the traditional proof of ld is the divergence term but showing that this term is small lemma b.3 follows very simply from typical strongly convex optimization techniques all the results corresponding to partial participation and varying rates follow almost directly from the proof of the main theorem and do not have much theoretical novelty 2 i liked the idea of allowing for correlated noise but i could not find the proof of theorem b.8 (theorem 5.9) in the paper so its difficult to gauge the difficulty of this part or even the correctness of this result if i have missed the proof and the authors could point me towards it that would be great the paper lacks theoretical novelty but i liked the idea of introducing ld in the federated setting i may change my decision after discussion with other reviewers

docsepthe paper proposes the federated averaging langevin dynamics algorithm fald the theoretical guarantees for the proposed algorithm are developed and their relationships with noise type heterogeneity and learning rate are studied the paper has shown extensive theoretical analysis for the proposed fald method such as convergence for independent noise and varying learning rates with or without full device participation however given the existing theoretical analysis for sgld based on the 2-wasserstein distance and the theoretical analysis for fedavg with or without full device participation the results are straightforward could the authors show the difficulty of the theoretical analysis in this paper empirical studies are missing in the current version an experiment section should be added to the paper to validate the theoretical findings and investigate the performance of fald particularly the reviewer would like to see the performance of fald in terms of different local updates learning rates participating devices etc and its comparison with other distributed ld methods typo section 3.2 "a energy function" the paper has given extensive theoretical convergence analysis for the fald algorithm via the 2-wasserstein distance however empirical studies to verify the theoretical findings are missing in the main paper

docsepthe paper proposes a federated averaging langevin algorithm fald and analyzes its convergence under non-iid and partial participation settings the main contribution is the theoretical guarantees for a federated averaging langevin algorithm for strongly logconcave distributions with non-iid data the proof sketch is well written and clear it seems that the proof is sound however most of the analysis technique can be found in the literature the proof technique is thus standard i think the writing has room to improve but overall it is clear and easy to follow i have the following concerns 1 in section 3.1 i think it is better to stress that $\beta_k$ is an auxiliary sequence besides how $\beta_k^c$ involves the local update steps in one iterate of the fedavg algorithm seems to be incorrect i think it should add $\theta_k^c = \beta_k^c$ for $2 \le k \le K$ 2 algorithm 1 is easy to understand since it is an approximation of fedavg however algorithm 2 is not so intuitive the author didnt explain why we should add an additional shared noise $\dot\xi_k$ and make the final noise correlated with the original noise $\xi_k^c$ in algorithm 2 how does such a shared noise ensure privacy and what kind of definition of privacy is used the last paragraph of section 4 is quite vague and ambiguous to substantiate the point i think more argument and discussion should be added 3 the current presentation of the main result is troublesome the author could first introduce the main result and then provide a proof sketch after all not everyone is interested in the proof technique whats more the analysis for algorithm 2 is deferred to section 5.3.3 but its introduction is in section 4 i think it is somewhat too late and not helpful for readers to grasp the algorithm 4 the first paragraph of section 5.2 seems to have a typo $\nabla f(\bar\theta_t)$ should be defined as $\sum_{c=1}^n p_c \nabla f_c(\bar\theta_t)$ 5 though this paper is theoretical the author could still provide some numerical experiments to show the convergence of fald those experiments can further validate the theorems for example the tradeoff of K and the effect of the sampling methods after all fald is different from fedavg i dont know whether the convergence of fald has been empirically studied before the claimed contributions include fald and its theoretical guarantees however the analysis technique is commonly used in the literature while no empirical experiments illustrate the effectiveness of fald besides there are some presentation issues and ambiguity in the motivation i dont think this paper is ready for publication

docsepthis paper studies federated learning algorithms where a collection of computing nodes seek to collectively optimize a global objective through parallelized updates executed at a server different from prior works which execute local-only stochastic gradient updates on nonconvex losses here a stochastic gradient langevin update is developed which additionally incorporates randomized gaussian perturbations convergence theory is presented for the proposed scheme which establishes pointwise convergence in mean as well as convergence in distribution according to the wasserstein metric variations of the noise generating process which introduce heterogeneity and privacy preservation are also proposed and analyzed assuming strong convexity assumption 5.2 seems completely ridiculous in this setting one of the major motivations for considering stochastic gradient langevin dynamics is to improve the performance of nonconvex stochastic optimization algorithms so as to escape spurious stationarity points and arrive at global extrema in fact this can be rigorously shown to tend towards the distribution of global extrema without convexity with a nonasymptotic rate given in raginsky et al 2017 there are no numerical experiments that demonstrate the utility of the approach how restrictive is it to consider logconcave distributions this should preclude most neural network training in practice which is the primary emphasis of federated learning as compared with more classical methodologies based upon consensus/multi-agent optimization is there a practical phenomenon where standard federated learning or its variants that use lagrange multipliers such as feddyn and scaffold get stuck whereas this version with randomized perturbations does not i feel that this scenario is exactly the nonconvex setting which is excluded from consideration in this work it is difficult to make sense of what are the specific technical innovations in the analysis here and how they are a departure from earlier results this work is mainly of theoretical interest as it considers randomized perturbations in a federated averaging setting however its technical scope precludes the most useful instances of it in practice and it is missing experiments therefore it seems of limited usefulness in my view

### Summary:
this paper proposes a federated averaging langevin dynamics fald for numerical mean prediction with uncertainty quantification under the setting of federated learning convergence analysis for the proposed method under the smoothness and strong-convexity assumptions is also provided and the results are summarized in theorems 5.7 to 5.10 each of which bounds the wasserstein-2 distance $W_2(\mu_k, \pi)$ between the model distribution $\mu_k$ and the target distribution $\pi$ under different settings this paper received 5 reviews in total with scores 6 5 3 5 and 3 some reviewers evaluated positively the novelty of the idea of using the langevin dynamics in the federated setting which i would also like to acknowledge upon reading the paper by myself however i find that the mathematical formulations are in some places not correct what i think problematic is the third equation in equation 3 the righthand side is a function of n variables $\theta_k^c$ and they undergo different local updates at different clients when $k \not\equiv 0 \pmod K$ ie the synchronization does not take place also $\nabla \tilde f_c$ is in general a nonlinear function of its argument therefore the righthand side cannot be written in general as a function of a single variable $\theta_k$ which is defined as $\theta_k = \sum_{c=1}^n p_c \theta_k^c$ making this equation incorrect this problem would affect various parts of the arguments to follow in this paper such as the first two equations in equation 16 on page 14 the two inline equations just after equation 16 equation 18 the second equality in the inline equation in page 15 line 1 and the third line in equation 25 on page 18 to mention a few thus i have to question the validity of the theoretical development in this paper another point i would like to mention is that i did not understand the definition of schemes i and ii in section 5.4 it is not stated at all that $\mathcal{S}_k$ is a random quantity here furthermore the conditions with/without replacement are not described at all still another point to mention is that i did not understand the claim in page 7 lines 30-31 does it mean that if one knows the number $T_\epsilon$ of steps needed to achieve the precision $\epsilon$ then the number K of local steps per synchronization should be set of the order of $\sqrt{T_\epsilon}$ but $T_\epsilon$ depends on K so that it would be unnatural to assume that one knows $T_\epsilon$ irrespective of K in the first place because of these i would judge that this paper is not yet ready for presentation in its current form i would therefore not be able to recommend acceptance of this paper

minor points
citation style the authors use throughout the paper what are called narrative citations even though there are occasions where what are called parenthetical citations the author name and publication date both enclosed in parentheses should be used
page 3 line 7 "the" unbiased stochastic gradient should be "an" unbiased stochastic gradient there are several unbiased estimators for the gradient and what is mentioned here is only one instance of them
page 3 lines 23-24 the aggregation should take place not on each client but on the central server
page 3 line 36 an energy function an unbiased estimate
page 5 lines 17-20 the contents of assumptions 5.1 and 5.2 are not assumptions but definitions
page 6 line 2 to obtain "the" should be "a" lower bound
page 6 line 18 $\mathcal{D}_2$ is undefined
page 8 line 39 "a" should be "the" probability $p_c$ if it is meant to be the one defined in page 3 line 8 otherwise use of the same symbol to represent different quantities should be avoided
page 14 line 25 mod e k 0
page 15 line 30 hrho2 should be hrho
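The reviews and the meta-review above keep returning to the same two ingredients of FALD: each client runs local stochastic gradient Langevin steps with injected Gaussian noise, and the server periodically replaces the local iterates by the weighted average $\theta_k = \sum_{c=1}^n p_c \theta_k^c$. The sketch below is a minimal, hypothetical reconstruction of that loop for a toy strongly log-concave target; the local energies, weights, step size, and synchronization period are illustrative assumptions and not the paper's actual algorithm or experimental setup.

```python
import numpy as np

# Minimal sketch of a FALD-style loop (assumed form, not the paper's code):
# each client c performs local Langevin steps on its own energy f_c, and every
# K local steps the server broadcasts the weighted average theta_k = sum_c p_c theta_k^c.

rng = np.random.default_rng(0)

n_clients = 5          # number of devices (illustrative)
d = 2                  # parameter dimension
h = 1e-2               # learning rate / step size
K = 10                 # local steps between synchronizations
T = 2000               # total number of local steps
p = np.full(n_clients, 1.0 / n_clients)   # client weights p_c, summing to one

# Toy strongly convex local energies f_c(theta) = 0.5 * ||theta - m_c||^2,
# so the global target is log-concave, matching the paper's assumptions.
means = rng.normal(size=(n_clients, d))

def grad_fc(c, theta):
    """Gradient of the illustrative local energy on client c."""
    return theta - means[c]

theta = np.tile(rng.normal(size=d), (n_clients, 1))   # theta_k^c, one row per client

for k in range(T):
    for c in range(n_clients):
        noise = rng.normal(size=d)                     # injected Gaussian noise xi_k^c
        theta[c] = theta[c] - h * grad_fc(c, theta[c]) + np.sqrt(2.0 * h) * noise
    if (k + 1) % K == 0:                               # synchronization step
        avg = p @ theta                                # theta_k = sum_c p_c theta_k^c
        theta[:] = avg                                 # broadcast the average back

print("final synchronized sample:", theta[0])
```

Note that in this sketch the averaged variable only exists at the synchronization steps, i.e. when `(k + 1) % K == 0`; between synchronizations each client evolves its own iterate. This is exactly the structural point the meta-review raises about the third equation in equation 3, which treats the right-hand side as a function of a single averaged variable even when $k \not\equiv 0 \pmod K$.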
[ 7052, 2412, 45542, 1123, 10670, 342, 1327, 74, 301, 941, 285, 1263, 849, 253, 13945, 6046, 285, 253, 19191, 11786, 6046, 253, 19331, 273, 941, 285, 253, 11962, 4715, 4142, 2818, 253, 14940, 50276, 20, 12106, 253, 7898, 11497, 4758, 50276, 45563, 50276, 18, 253, 2934, 273, 10491, 970, 298, 912, 8498, 12393, 275, 253, 10208, 12072, 4715, 4758, 310, 747, 50275, 20881, 1255, 50275, 17480, 436, 310, 247, 10527, 2929, 891, 717, 1666, 272, 619, 3061, 327, 697, 10527, 7680, 50276, 18, 38135, 275, 253, 4737, 273, 42651, 310, 32809, 50276, 7483, 629, 273, 253, 4737, 534, 310, 1027, 432, 253, 5899, 4737, 273, 42651, 310, 253, 23279, 1307, 533, 4645, 326, 436, 1307, 310, 1355, 18057, 270, 20, 3637, 1077, 3365, 432, 6867, 7052, 17133, 13757, 5609, 512, 253, 1543, 3969, 281, 7898, 11497, 285, 11962, 4142, 956, 2761, 3587, 50276, 4064, 253, 4737, 273, 253, 2022, 10012, 285, 1057, 417, 452, 1199, 10527, 38135, 50276, 19, 891, 10490, 253, 2934, 273, 6941, 323, 9578, 6046, 533, 891, 812, 417, 1089, 253, 4737, 273, 10012, 270, 25, 10012, 8978, 275, 253, 2929, 594, 697, 2834, 281, 11206, 253, 10183, 273, 436, 629, 390, 1014, 253, 36594, 273, 436, 906, 604, 891, 452, 9829, 253, 4737, 285, 253, 4477, 812, 1127, 479, 4404, 352, 326, 651, 320, 1270, 50274, 783, 2929, 19756, 10527, 38135, 533, 891, 10490, 253, 2934, 273, 16984, 42651, 275, 10208, 12072, 4758, 891, 778, 1818, 619, 3061, 846, 5955, 342, 643, 30628, 5474, 339, 431, 248, 2929, 29328, 10208, 12072, 25001, 298, 912, 8498, 7870, 5933, 269, 8950, 253, 10527, 23632, 323, 253, 4081, 5933, 403, 3715, 285, 616, 7688, 342, 6046, 1511, 19331, 285, 4715, 2281, 403, 5421, 50276, 783, 2929, 556, 2011, 9470, 10527, 1783, 323, 253, 4081, 269, 8950, 1332, 824, 347, 14940, 323, 3907, 6046, 285, 11962, 4715, 4142, 342, 390, 1293, 2120, 2813, 11497, 2299, 1677, 253, 5368, 10527, 1783, 323, 48237, 392, 1754, 327, 374, 88, 30666, 6339, 4181, 285, 10527, 1783, 323, 10208, 42921, 342, 390, 1293, 2120, 2813, 11497, 253, 1543, 403, 15246, 812, 253, 4477, 921, 253, 10183, 273, 253, 10527, 1783, 275, 436, 2929, 50276, 358, 5378, 474, 2175, 403, 5816, 275, 253, 1655, 2715, 253, 3368, 2593, 943, 320, 2879, 281, 253, 2929, 281, 17813, 253, 10527, 4342, 285, 7409, 253, 3045, 273, 269, 8950, 3782, 253, 37317, 651, 751, 281, 923, 253, 3045, 273, 269, 8950, 275, 2426, 273, 1027, 1980, 11269, 4715, 4142, 15299, 4095, 3966, 285, 697, 5301, 342, 643, 5939, 42651, 3082, 50274, 555, 5367, 2593, 4567, 247, 2341, 1159, 253, 2929, 556, 1677, 9470, 10527, 14940, 1783, 323, 269, 8950, 5933, 3066, 374, 88, 30666, 6339, 2299, 16774, 2175, 281, 12654, 10527, 4342, 403, 5816, 275, 253, 2022, 2929, 50276, 7152, 339, 431, 248, 2929, 29328, 247, 10208, 12072, 25001, 298, 912, 8498, 5933, 269, 8950, 285, 3537, 13505, 697, 14940, 762, 1327, 74, 301, 285, 629, 474, 11497, 7533, 253, 2022, 7680, 310, 253, 10527, 23632, 323, 247, 10208, 12072, 25001, 298, 912, 8498, 5933, 323, 7052, 2412, 45542, 1123, 10670, 342, 1327, 74, 301, 941, 50275, 783, 4737, 23211, 310, 973, 3542, 285, 2590, 352, 3133, 326, 253, 4737, 310, 3590, 2299, 954, 273, 253, 1783, 5853, 476, 320, 11420, 275, 6195, 10207, 1177, 253, 4737, 5853, 310, 3021, 2629, 50276, 74, 1158, 253, 4028, 556, 2316, 281, 3157, 533, 4583, 352, 310, 2590, 285, 3477, 281, 956, 50276, 74, 452, 253, 1563, 7350, 337, 275, 2593, 4562, 891, 1158, 352, 310, 1805, 281, 4073, 253, 701, 518, 310, 271, 24026, 3425, 16280, 849, 701, 518, 68, 8687, 253, 1980, 11269, 3213, 275, 581, 35388, 273, 253, 10208, 42921, 5933, 3133, 281, 320, 13583, 891, 1158, 352, 943, 823, 
253, 85, 518, 260, 50276, 9900, 518, 68, 323, 374, 458, 465, 458, 465, 374, 5933, 337, 310, 3477, 281, 2096, 1580, 352, 310, 271, 11193, 273, 10208, 42921, 2299, 5933, 374, 310, 417, 594, 27350, 253, 2488, 42126, 5513, 2139, 359, 943, 823, 271, 3081, 2505, 3342, 18867, 6046, 14261, 89, 1479, 285, 1056, 253, 2457, 6046, 9578, 342, 253, 3236, 6046, 1269, 1479, 68, 275, 5933, 374, 849, 1057, 824, 247, 6096, 6046, 5416, 11068, 285, 752, 2238, 273, 5426, 273, 11068, 310, 908, 50276, 783, 1390, 12494, 273, 2593, 577, 310, 3240, 21248, 285, 49195, 311, 528, 281, 4326, 4513, 253, 1127, 891, 1158, 625, 4154, 285, 5955, 943, 320, 2879, 495, 253, 1655, 9759, 273, 2022, 906, 310, 45991, 253, 2488, 812, 806, 9569, 253, 2022, 906, 285, 840, 2085, 247, 4737, 23211, 846, 512, 417, 4130, 310, 6110, 275, 253, 4737, 5853, 47515, 625, 253, 1783, 323, 5933, 374, 310, 36334, 281, 2593, 44504, 533, 697, 10199, 310, 2593, 577, 891, 1158, 352, 310, 8489, 1512, 3563, 285, 417, 9371, 323, 10668, 281, 15909, 253, 5933, 50276, 21, 253, 806, 12494, 273, 2593, 8073, 1646, 281, 452, 247, 1511, 295, 6348, 269, 1274, 35292, 6168, 255, 918, 943, 320, 2931, 347, 2020, 68, 18, 79, 21136, 295, 6348, 269, 68, 1274, 35292, 6168, 255, 918, 608, 2167, 436, 2929, 310, 10527, 253, 2488, 812, 1335, 2085, 690, 10704, 4679, 281, 921, 253, 14940, 273, 269, 8950, 1110, 4679, 476, 2007, 17813, 253, 39383, 323, 1650, 253, 5454, 2727, 273, 465, 285, 253, 1055, 273, 10491, 3082, 846, 512, 269, 8950, 310, 1027, 432, 10208, 42921, 891, 13414, 871, 1880, 253, 14940, 273, 269, 8950, 556, 644, 45190, 5421, 1078, 253, 1448, 68, 1468, 264, 7680, 2486, 269, 8950, 285, 697, 10527, 23632, 2299, 253, 1783, 5853, 310, 7744, 908, 275, 6195, 459, 1177, 1223, 642, 16774, 385, 365, 343, 942, 281, 17093, 253, 12510, 273, 269, 8950, 16280, 627, 403, 690, 9759, 3374, 285, 28931, 327, 16038, 891, 13414, 1158, 436, 2929, 310, 4704, 323, 9311, 50275, 7152, 33032, 2520, 2929, 2175, 10208, 12072, 4715, 11333, 835, 247, 4849, 273, 12672, 7632, 7703, 281, 26708, 22318, 247, 4156, 8103, 949, 7529, 1025, 11269, 11407, 387, 247, 4771, 1027, 432, 2720, 2987, 534, 13194, 1980, 7483, 19191, 11786, 11269, 327, 1327, 44181, 11655, 1060, 247, 19191, 11786, 298, 912, 8498, 5731, 310, 3715, 534, 23000, 31167, 14871, 305, 12064, 26309, 14940, 3762, 310, 3559, 323, 253, 4081, 6974, 534, 25097, 1127, 3020, 50276, 585, 41801, 275, 1599, 347, 973, 347, 14940, 275, 3268, 2556, 281, 253, 369, 2152, 6339, 7982, 10575, 273, 253, 6046, 11365, 1232, 534, 9569, 19331, 285, 11068, 23029, 403, 671, 4081, 285, 5867, 50275, 37411, 2266, 17133, 414, 9376, 8073, 3133, 4336, 19082, 275, 436, 4758, 581, 273, 253, 2201, 42852, 323, 7296, 19191, 11786, 298, 912, 8498, 8062, 310, 281, 3157, 253, 3045, 273, 1327, 44181, 19191, 13757, 11333, 594, 347, 281, 8773, 46541, 4660, 15752, 2792, 285, 12666, 387, 4156, 1021, 250, 785, 275, 958, 436, 476, 320, 8132, 29689, 2011, 281, 5257, 4404, 253, 3268, 273, 4156, 1021, 250, 785, 1293, 17133, 414, 342, 1327, 284, 40045, 3875, 2281, 1677, 275, 23603, 26495, 1162, 355, 4240, 50276, 9088, 403, 642, 10704, 4679, 326, 7568, 253, 11839, 273, 253, 2746, 849, 29190, 310, 352, 281, 1908, 2412, 45542, 1123, 10670, 436, 943, 31423, 954, 11454, 2990, 3733, 275, 3946, 534, 310, 253, 3625, 15075, 273, 10208, 12072, 4715, 347, 2429, 342, 625, 8946, 39396, 1754, 2220, 13969, 23939, 12788, 13757, 50276, 261, 627, 247, 8542, 11562, 835, 2629, 10208, 12072, 4715, 390, 697, 11640, 326, 897, 16653, 6324, 18878, 4670, 824, 347, 10208, 6421, 266, 285, 32973, 755, 10960, 5727, 436, 
2715, 342, 14871, 26309, 1057, 417, 891, 1928, 326, 436, 10076, 310, 4555, 275, 1327, 44181, 7533, 534, 310, 10432, 432, 8180, 275, 436, 789, 50276, 262, 310, 2834, 281, 1056, 3282, 273, 752, 403, 253, 2173, 7681, 32771, 275, 253, 1783, 1060, 285, 849, 597, 403, 247, 16018, 432, 4321, 1543, 50275, 2520, 789, 310, 7194, 273, 10527, 1600, 347, 352, 19401, 14871, 26309, 275, 247, 10208, 12072, 25001, 4758, 2299, 697, 7681, 7990, 46704, 253, 954, 4217, 10872, 273, 352, 275, 3946, 285, 352, 310, 5816, 4679, 3103, 352, 3133, 273, 3710, 31471, 275, 619, 1859, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 10208, 12072, 25001, 298, 912, 8498, 8062, 269, 8950, 323, 10704, 1599, 10554, 342, 11649, 21652, 762, 253, 4758, 273, 10208, 12072, 4715, 14940, 1783, 323, 253, 4081, 1332, 762, 253, 6032, 1255, 285, 2266, 44181, 13260, 310, 671, 2530, 285, 253, 1543, 403, 17903, 275, 39383, 45916, 740, 1016, 273, 534, 14493, 253, 369, 2152, 6339, 19, 4181, 259, 19, 1906, 76, 2059, 875, 253, 1566, 3268, 278, 2788, 285, 253, 2303, 3268, 12580, 762, 1027, 7533, 50276, 2520, 2929, 2959, 608, 10123, 275, 2264, 342, 7363, 721, 608, 495, 608, 285, 495, 690, 30628, 6760, 14962, 253, 38135, 273, 253, 2934, 273, 970, 253, 298, 912, 8498, 8062, 275, 253, 10208, 12072, 4758, 534, 891, 651, 671, 751, 281, 14409, 2220, 4361, 253, 2929, 407, 4266, 2299, 891, 1089, 326, 253, 15965, 26850, 403, 275, 690, 5053, 417, 3451, 752, 891, 1158, 20276, 310, 253, 2626, 5150, 275, 5150, 495, 253, 987, 4608, 1930, 310, 247, 1159, 273, 295, 4903, 253, 85, 518, 68, 285, 597, 15080, 1027, 1980, 11269, 387, 1027, 8548, 672, 694, 1584, 371, 400, 470, 2307, 465, 26332, 253, 27801, 1057, 417, 1379, 1659, 671, 295, 1752, 255, 300, 1545, 68, 310, 275, 2087, 247, 14561, 1159, 273, 697, 4154, 3103, 253, 987, 4608, 1930, 2550, 320, 3542, 275, 2087, 347, 247, 1159, 273, 247, 2014, 4778, 253, 85, 518, 534, 310, 2931, 347, 253, 85, 518, 2204, 68, 18, 18650, 291, 6168, 518, 68, 2403, 436, 5150, 13583, 436, 1895, 651, 2818, 2710, 4243, 273, 253, 7125, 281, 956, 275, 436, 2929, 824, 347, 253, 806, 767, 7424, 275, 5150, 1668, 327, 3239, 1638, 253, 767, 13866, 7424, 816, 846, 5150, 1668, 5150, 1283, 253, 1273, 13919, 275, 253, 13866, 5150, 275, 3239, 1458, 1386, 337, 285, 253, 2626, 1386, 275, 5150, 2030, 327, 3239, 1283, 281, 3748, 247, 1643, 3021, 891, 452, 281, 1953, 253, 13091, 273, 253, 10527, 2440, 275, 436, 2929, 50276, 23955, 1127, 891, 651, 751, 281, 3748, 310, 326, 891, 858, 417, 2096, 253, 5426, 273, 15849, 891, 285, 21255, 275, 2593, 8255, 352, 310, 417, 4767, 387, 512, 326, 14168, 68, 932, 76, 310, 247, 3632, 10671, 1060, 33810, 253, 2515, 342, 14920, 5407, 403, 417, 2529, 387, 512, 50276, 23350, 1529, 1127, 281, 3748, 310, 326, 891, 858, 417, 2096, 253, 1750, 275, 3239, 818, 3104, 1884, 2405, 1057, 352, 1599, 604, 581, 6057, 253, 1180, 246, 4259, 273, 5018, 281, 5115, 253, 12320, 299, 4277, 840, 581, 943, 873, 253, 1180, 465, 273, 1980, 5018, 591, 27801, 943, 320, 873, 273, 253, 1340, 273, 8084, 442, 4277, 533, 246, 4259, 7024, 327, 465, 594, 326, 352, 651, 320, 44822, 281, 5467, 326, 581, 6057, 246, 4259, 30472, 273, 465, 275, 253, 806, 1659, 50276, 12157, 273, 841, 891, 651, 5963, 326, 436, 2929, 310, 417, 2568, 4704, 323, 9759, 275, 697, 1655, 830, 891, 651, 3103, 417, 320, 2104, 281, 5583, 14924, 273, 436, 2929, 50276, 37585, 2792, 50276, 26977, 3740, 253, 4477, 897, 4768, 253, 2929, 752, 403, 1925, 253, 14511, 30404, 1014, 2167, 627, 403, 15530, 835, 752, 403, 1925, 253, 2885, 6168, 474, 30404, 253, 2488, 1416, 285, 9311, 3522, 403, 
1097, 26895, 275, 41616, 943, 320, 908, 50276, 6377, 495, 1386, 818, 310, 253, 50276, 266, 38663, 19191, 11786, 627, 403, 2067, 38663, 48489, 323, 253, 11786, 285, 752, 310, 5393, 1060, 310, 760, 581, 4227, 273, 731, 50276, 6377, 495, 3104, 374, 21397, 253, 20828, 943, 1379, 1659, 417, 327, 1016, 5268, 533, 327, 253, 4275, 4771, 50276, 6377, 495, 1386, 5540, 271, 2341, 1159, 271, 38663, 6642, 50276, 6377, 608, 3104, 1722, 938, 253, 9410, 273, 13260, 8319, 285, 8073, 403, 417, 13260, 533, 14308, 50276, 6377, 721, 1386, 374, 281, 4044, 253, 50276, 66, 50276, 12973, 3033, 50276, 6377, 721, 1386, 1283, 14168, 1179, 69, 19, 310, 17011, 50276, 6377, 854, 1386, 6931, 247, 50276, 783, 5912, 21136, 604, 352, 310, 5486, 281, 320, 253, 581, 2931, 275, 3239, 495, 1386, 854, 5010, 897, 273, 253, 1072, 9484, 281, 1957, 1027, 13483, 943, 320, 16371, 50276, 6377, 1638, 1386, 2030, 771, 299, 50276, 76, 470, 50276, 6377, 1458, 1386, 1884, 288, 2859, 19, 50276, 73, 2859 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 7052, 2412, 45542, 1123, …, 73, 2859 ] (token-ID array elided: an integer encoding of the underlying review text produced by the dataset's tokenizer; it is not human-readable without the matching vocabulary)
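A minimal sketch of how columns like these are typically produced, assuming a Hugging Face tokenizer. The tokenizer, model vocabulary, and maximum sequence length actually used to build this dataset are not stated here, so the "gpt2" checkpoint and the length limit below are illustrative assumptions only.

```python
# Sketch only: the dataset's real tokenizer is unknown; "gpt2" and max_length=2048
# are illustrative assumptions, not the values used to build these columns.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

review_text = "Below is given review of a research paper ..."   # the Input column
summary_text = "This paper proposes ..."                         # the Output column

# Encode the review: input_ids are integer token indices; attention_mask is 1
# for every real token (hence the long run of ones above) and 0 for padding.
encoded = tokenizer(review_text, truncation=True, max_length=2048)
input_ids = encoded["input_ids"]
attention_mask = encoded["attention_mask"]

# Labels are commonly the token ids of the target text (here, the summary).
labels = tokenizer(summary_text, truncation=True, max_length=2048)["input_ids"]

# Decoding recovers readable text from an id array such as the one elided above.
print(tokenizer.decode(input_ids[:20]))
```

A decode call like the last line is also the quickest way to inspect what any of the elided integer arrays in this dump actually say, provided the same tokenizer is used.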