Dataset columns and value-length ranges:

Column           Type    Min length   Max length
Input            string  251          41.6k
Output           string  137          9.7k
input_ids        list    157          2.05k
attention_mask   list    157          2.05k
labels           list    157          2.05k
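The column layout suggests each row is a summarization example prepared for causal-LM fine-tuning: input_ids encode the Input prompt concatenated with the Output summary, attention_mask is all ones (no padding), and labels simply mirror input_ids. Below is a minimal sketch of how such a row could be built; the tokenizer choice ("gpt2") and the exact concatenation rule are assumptions, since the dump does not name them.

```python
# Hypothetical sketch only: the tokenizer and the Input+Output concatenation
# are assumptions, not facts taken from this dump.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # assumed tokenizer

def build_row(input_text: str, output_text: str, max_length: int = 2048) -> dict:
    # Tokenize prompt and target together, truncating to the model context
    # (the listed input_ids lengths top out around 2.05k tokens).
    ids = tokenizer(input_text + " " + output_text,
                    truncation=True, max_length=max_length)["input_ids"]
    return {
        "Input": input_text,
        "Output": output_text,
        "input_ids": ids,
        "attention_mask": [1] * len(ids),  # no padding, so every position is attended to
        "labels": list(ids),               # labels mirror input_ids for the LM loss
    }
```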
Example 1

Input:

Below is a review of a research paper submitted to a conference or journal. Please write a summary of the review.

### Review:

this paper proposes a particular variant of experience replay with behavior cloning as a method for continual learning the approach achieves good performance while not requiring a task label this paper makes the point that i definitely agree with that all of the approaches being considered should compare to experience replay and that in reality many of them rarely do better however i am not totally convinced when it comes to the value of the actual novel aspects of this paper much of the empirical analysis of experience replay ie the buffer size the ratio of past and novel experiences etc was not surprising or particular novel in my eyes the idea of using behavior cloning is motivated fully through the lens of catastrophic forgetting and promoting stability and does not at all address achieving plasticity this was interesting to me as the authors do mention the stabilityplasticity dilemma but a more theoretical analysis of why behavior cloning is somehow the right method among various choices to promote stability while not sacrificing or improving plasticity was definitely missing for me other options can certainly be considered as well if your aim is just to add stability to experience replay such a notion of weight importance for the past like in ewc kirkpatric et al 2017 and many other papers or using knowledge distillation like lwf li and hoeim 2016 lwf in particular seems quite related i wonder how lwf experience replay compares to the approach proposed here in general the discourse could become a lot strong in my eyes if it really considered various alternatives and explained why behavior cloning provides theoretical value overall behavior cloning seems to help a little bit based on the experiments provided but this finding is very likely indicative of the particular problem setting and seemingly not really a game changer in the paper they explore settings with fairly prolonged periods of training in each rl domain one at a time if the problem was to become more nonstationary with more frequent switching ie more in line with the motivation of lifelong learning i would imagine that increasing stability is not necessarily a good thing and may slow down future learning

the paper proposes a novel trial to alleviate the catastrophic forgetting for continual learning which is kind a mixture model of on and offpolicy the core concept of the method is utilizing experience replay buffer for all past events with new experience they mainly worked on their method in the setting of reinforcement learning in the experiments they show that the model successfully mitigate the catastrophic forgetting with this behavioral cloning and has the performance comparable to recent continual learning approaches the paper is easy to follow and the methodology is quite intuitive and straight forward in this paper i have several questions q1 i wonder the reason that every tasks are trained cyclically in sequence and is there any trial to learn each task just once and observe the catastrophic forgetting of them when they have to detain the learned knowledge in a long time without training them again as does most of visual domain experiments of the other continual learning research q2 in figure 5 i wonder why the natlabvaryingmapramdomizeprobe task can perform well even they didnt learn yet the score of brown line increases nearly 6070 of final score after trained during training the first task because the tasks are deeply correlated or it is just common property of probe task q3 using reservoirbuffer to prevent catastrophic forgetting is natural and reasonable is there some of quantitative comparison in the sense of memory requirement and runtime i feel that 5 or 50 million experiences at each task are huge enough to memorize and manage additionally in the experiment of figure 5 i think it could be much clear with a verification that the probe task is semantically independent no interference over all the other tasks also it is quite hard to compare the performance of the models just with plots i expect that it could be much better to show some of quantitative results as number

the authors propose an approach to augment experience replay buffers with properties that can alleviate issues with catastrophic forgetting the buffers are augmented by storing both new and historical experiences along with the desired historical policy value distribution the ac learning now couples two additional losses that ensures the new policy does not drift away from old actor distribution via kl and new value does not drift away from old critic distribution via l2 loss the authors provided clear experimental evidence that shows how an rl agent that does not use clear will observe catastrophic when we sequentially train different tasks and it is not due to destructive interference using the simultaneous and separate trainingevaluation experiments author also showed how different replay make ups can change the result of clear and its a matter of empirical tuning the formulation of clear also is simple while delivering interesting results it would have been nice to see how this is used in a practical setting as all these are synthetic environments tasks the discussion on relationship with biological mechanism also seems unnecessary as its unclear whether the mechanism proposed is actually whats in the cls

### Summary:

Output:

this paper and revisions have some interesting insights into using er for catastrophic forgetting and comparisons to other methods for reducing catastrophic forgetting however the paper is currently pitched as the first to notice that er can be used for this purpose whereas it was well explored in the cited paper selective experience replay for lifelong learning 2018 for example the abstract says while various methods to counteract catastrophic forgetting have recently been proposed we explore a straightforward general and seemingly overlooked solution that of using experience replay buffers for all past events it seems unnecessary to claim this as a main contribution in this work rather the main contributions seem to be to include behavioural cloning and do provide further empirical evidence that selective er can be effective for catastrophic forgetting further to make the paper even stronger it would be interesting to better understand even smaller replay buffers a buffer size of 5 million is still quite large what is a realistic size for continual learning hypothesizing how er can be part of a real continual learning solution which will likely have more than 3 tasks is important to understand how to properly restrict the buffer size finally it is recommended to reconsider the strong stance on catastrophic interference and forgetting catastrophic interference has been considered for incremental training where recent updates can interfere with estimates for older or other values this definition does not precisely match the provided definition in the paper further it is true that forgetting has often been used explicitly for multiple tasks trained in sequence however the issues are similar new learning overriding older learning these two definitions need not be so separate and further it is not clear that the provided definitions are congruent with older literature on interference overall there is most definitely useful ideas and experiments in this paper but it is as yet a bit preliminary improvements on placement motivation and experimental choices would make this work much stronger and provide needed clarity on the use of er for forgetting
input_ids: [token-ID encoding of the Input and Output text above; omitted for readability]
attention_mask: [a 1 for every token in input_ids; omitted]
labels: [identical to input_ids; omitted]
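Because labels duplicates input_ids and attention_mask is uniformly 1, the three list columns carry no information beyond the text fields. A quick round-trip check could confirm this; the tokenizer is again an assumption, as it is not specified in the dump.

```python
# Hypothetical verification that the list columns are redundant with the text fields.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # assumed tokenizer

def check_row(row: dict) -> None:
    assert row["labels"] == row["input_ids"]      # labels mirror input_ids
    assert set(row["attention_mask"]) == {1}      # no padding positions
    decoded = tokenizer.decode(row["input_ids"])
    # The ids should decode back to the start of the Input text (tokenizer permitting).
    assert decoded.startswith(row["Input"][:50])
    print(len(row["input_ids"]), "tokens")        # expected to fall in the 157-2.05k range above
```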
Example 2

Input:

Below is a review of a research paper submitted to a conference or journal. Please write a summary of the review.

### Review:

in this paper the authors provided their theoretical understanding on the translation and reconstruction guarantees of the cycleconsistent gans they showed that the translators based on deep relu networks can prevent the input information from dissipating during generation cycles which means we can lower bound the gap on the other hand there are positive results the authors proved that the same translators not only achieve zero generation loss asymptotically but the generated distribution converges to the target distribution almost surely the authors also showed the equivalence between l1 norm and 1wasserstein distance in the cyclic loss this paper is very solid in the theory the writing quality of this paper is high and most importantly the problem analyzed by the authors is very fundamental and important in the field of generative models although the literature review of this paper does not cover all the theoretical papers on gans it is not a big deal the results are original and significant i think there is no potential negative societal impact of this work since it is completely theoretical

in this paper the authors analyze the properties of image to image translation networks that are based on cycle consistency in particular the authors show that when the translation networks are based on relu activations they behave as information preserving transformations ipts next the authors also show that 1wasserstein distance and l1 cyclic distance are equivalent for sobolevsmooth data and analyze the effects of illposedness on the regeneration the setting is rigorously formulated and all the claims made in the manuscript appear to be shown in ample detail the relevant assumptions about the network architectures and the data are stated clearly by the authors

in this paper the authors propose a novel statistic framework to analyze the consistency of cyclegan type neural networks where translator and inversetranslator between two domains in the images spaces are trained in particular given relu dnn translators the authors prove that they are information preserving based on delicate functional analysis tools moreover the paper provides theoretical attests to previous empirical studies on the equivalence of l1 norm and 1w distance in cyclic loss it is also proved that using translation consistent translators a cyclicconsistent network can be built the strengths of the paper include rigorous statistical analysis to studying the cyclicconsistency of cyclegan type networks sharp nonasymptotic bounds on the translation error under mild conditions are rigorously proven also for the first time a deterministic bound on cumulative reconstruction error is given this sheds new light on where the illposedness of cyclegan type networks stems from the results are clearly stated and the motivations are explained well also commonly seen cyclegan type networks are reviewed and compared under the proposed framework showing the great value in the application of the proposed analyzing framework it might be more illustrative if numerical experiment results are given to support the theoretical results but since the paper is fruitful theoretically it is only a minor drawback the limitations of the paper are discussed in the last section

i do not think my expertise matches this paper so my summary of the paper and its contributions may not be accurate this paper provides a series of math explanations and proof for the statistical translation and regeneration guarantees of cycleconsistent networks the contribution of the paper is establishing an analytical way for not only cycleconsistent networks but also other deep generative models strengths there are rare papers that provide the translation of the deep generative models based on rigorous mathematics therefore i do think this paper has high originality and may have a good impact on the readers to conduct further research in the future weaknesses although math is a beautiful way to elaborate an algorithm sometimes telling a story by images has more affinity to the readers and may help readers to know the whole picture faster na

### Summary:

Output:

this paper analyzes the consistency of cycleconsistent gans the authors provide theoretical insights in how the information is preserved by relu dnn translators via functional analysis they additionally show the equivalence of l1 and 1w distance under the cyclic loss context the highlight of this paper is the rigorous statistical analysis and fundamental theoretical insights in generative models a minor point is that the reviewers suggest that more illustrative and intuitive presentation of the results will smooth the reading in general its a interesting paper and the ac recommends acceptance
input_ids: [token-ID encoding of the Input and Output text above; omitted for readability]
attention_mask: [a 1 for every token in input_ids; omitted]
labels: [identical to input_ids; omitted]
Example 3

Input:

Below is a review of a research paper submitted to a conference or journal. Please write a summary of the review.

### Review:

this paper proposes a method for using text editing for aligning language models to human values they compare with zeroshot gpt3 and instructgpt3 in addition to other methods and generally improve performance on their benchmarks after response thank you for your response the response has largely addressed my concerns and for this reason i will increase my score to a 6 for a broader audience i would strongly recommend adding these definitions of the terms that you added in the rebuttal in revision furthermore i can appreciate how expensive it can be to train the instructgpt models however given this statement our experiments confirm that simply scaling lms is not adequate for good alignment with human values which echoes the findings of recent studies 50 34 instead smaller lms trained with a few properly decomposed human demonstrations can often lead to better results 43 i would appreciate a clarification given what was mentioned in the response thank you for your hard work strengths i genuinely like the method i think its simple and has potential for effectiveness mostly wellwritten weaknesses i would appreciate more experiments on other datasets tasks to verify the applicability of your method many terms are used eg human values without a clear definition to what that refers to in the text yes they provide an ethics statement

the paper targets the alignment of existing language models to be aligned with human values and proposes a new learning method that leverages reinforcement learning for refining the generated text to be more aligned with human values the proposed learning method models insert delete and replace as a chain of edits using reinforcement learning and introduces augmented edits modeling aem the paper also proposes the use of 1 adversarial imitation learning ail which leverages negative samples and an adversarial language model for guiding the target language model to generate text which is more coherent to the context keeping the valuealignment and 2 value modeling vm where a language model based classifier is trained to judge the coherence between the generated text and the context the paper reports experimental results on three benchmarks moral stories mic and ethicsdeontology and compares the proposed method for alignment and coherence of the generated text the paper further reports the results for transfer learning with limited humanlabeled data and performs error analysis using humanguided correction the results portrayed in the paper show the effectiveness of the proposed aem architecture strengths the paper is well motivated and highlights the problem of value alignment in the existing language models the method proposed in the paper improves the existing language models by aligning them with human values the reported results on three benchmarks show the effectiveness of the proposed architecture when compared to other text generation approaches the transfer learning results and error analysis make the results of the proposed architecture more concrete and reliable weaknesses the human evaluation design as highlighted in the limitation section might be imposing biases on value judgments due to factors like demographics education etc the authors highlight the technical limitation of the proposed method regarding the max sequence length allowed for a language model moreover the authors clearly highlight the biases in the human participants demographic factors such as gender education and ideological belief which might have influenced their reported value judgment

this paper proposes a framework that uses sequential editions to align generated text with human values specifically they finetune the language model by creating edit steps between unaligned and aligned sequences using dynamic programmingbased methods as augmentation aem the reinforcement learning method is applied to make the generation coherent better with the given context the experimental results that have better value alignment and coherence compared with many previous language models and language models with larger sizes strength the human value alignment is an important research question and many controlled generation problems can be framed in this way the editingbased framework offers a lighweight way to plugin value alignment weakness the limited data scenario is helpful to this work is helpful but the demonstration in epochs makes it not very clear whether the fewshot cases fall into the individual efforts eg 16shot 32shot 64shot the authors addresses the limitation of max length capacity and evolving of human values

this paper addresses the problem of how we can control large lms in order to generate text that does not contain any toxic or moral context ie human valuealigned text in contrast to prior work that focuses on constraining the decoding algorithm or engineering prompts to encourage lms to produce valuealigned text the authors propose to perform multiple edits ie insert delete replace iteratively on the generated text until they correct immoral statements moreover instead of generating only one final edited sequence they generate multiple positive and negative examples and finetune the lm given these examples finally the authors include a reinforcement learning refinement step in order to encourage the lm to produce sequences that are valuealigned but still stay in context they perform human evaluation on text generated by their method and other comparison models and find that the generated text by their approach is more valuealigned and stays in context strengths 1 the authors propose an interesting and intuitive approach for correcting generated text from large lms the method is novel and one main advantage of the proposed method is that it is interpretable with multiple edits until they reach the desired text in contrast to prior work that mostly tries to construct prompts or constraint decoding 2 the authors perform multiple ablation studies and analysis both in the main paper and in the appendix overall their analysis indicates the performance gains of their method which problems they address in contrast to prior work and what components of their method are the most important weaknesses 1 although the proposed approach offers text of higher quality in comparison with comparison models a significant limitation is that the lm has to be finetuned multiple times as far as i understand the model is first finetuned on positive and negative examples generated by editing the source text and then is further trained via an rl objective this is an important limitation in comparison to other approaches constrained decoding prompts that do not require retraining of the model this is especially important when we consider the problem at scale the authors here finetune a 345m model what will happen when we move to 13b or 175b model as the comparison systems the whole problem at hand is related to how we can control massive lms that have used the entire web for training and i am not sure whether this method can scale 2 section 32 is not described in detail so it is difficult to understand all details this section would benefit from a more complete description where the inputoutput all steps and objective for finetuning are described 3 during human evaluation the authors state that they asked human judges a to what extend does the edited response improve the original response in terms of alignment with human values and how coherent is the edited response with the given context although the qualities that the authors try to judge are hard to define in general the questions are not well formulated first for question a what are the human values this is too generic and it is wellknown that not all people share the same values in fact there is a large variation i think this question should be much more specific stating specific values that the authors want to test eg toxic racist or sexist text with examples second question b again is too generic the authors should define coherence and state specifically and in plain english the qualities that they search for given this setup the results of the human evaluation are not very trustworthy although there is definitely a positive indication and additional automatic metrics in the appendix that agree with the conclusions of the paper as stated in the weaknesses section above there are two main limitations of this paper the first is related to the parameters of the lm that have to be finetuned which poses a question about scalability to massive lms and the second is related to the human evaluation which was not performed carefully given the difficulty of this problem

### Summary:

Output:

this paper received reviews and ratings that are leaning positively the reviewer discussion and ethics review highlighted weaknesses that make this paper quite borderline strengths 1 the goal and task of the paper are quite well motivated and they are geared towards positive societal impact refining generated text to be more aligned with human values eg gearing text towards moral actions 2 the text editing approach of the paper using adversarial imitation learning is both novel and intuitive according to the reviewer 3 the authors give a convincing justification for their textedit paradigm as prior attributecontrol generation methods eg pplm tend to struggle when the context is polluted experimental results appear to support their claim 4 the chainofedit paradigm of the paper showing eg how the text morphs into a more moral one eases error diagnosis and enables interactive correction weaknesses 1 the questions asked to amt workers in the human evaluation do not seem to be well formulated ie to what extent does the edited response improve the original response in terms of alignment with human values first the term human value is very generic even the specific human value gets defined later eg deontology the amt question form reads more like one that would be given to an expert rater and would need to be either written in plain english or given to trained judges second the judges need to identify improvement in alignment with human values could be a source of confusion as the revised response could align well but no better than the original response in which case the improvement is nonexistent 2 the ethics reviewers made comments that have bearing on the technical merits of the paper as they both pointed out that the authors modeling assumption that human values are static and consistent across contexts is probably too simplistic as ethics reviewer udnx views and norms of the world are varied and may depend on complex contexts eg one may need a lot of background information about a given situation to know which action or statement is more moral what seems concerning in the paper is that the context seems to be always reduced to one sentence and it seems doubtful this provides enough information to make value judgments in many realworld situations in sum the paper makes valuable contributions but there may be biases in the results weakness 1 and the practical utility may be somewhat limited weakness 2 we would recommend that the authors address the amtrelated concern and discuss more extensively their humanvalue assumptions considering the ethics reviews regarding ethical concerns we also highly recommend that the authors follow the suggestions proposed by the ethics reviewers eg include more information about the representativeness of the participants
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: The authors have done a remarkable job of reproduction; the work incorporates the major todos of RC.

This paper seems to follow the guidelines for reproducibility (notation, code). The authors introduce a simple output layer, called GEN, that can be used in graph neural networks to obtain a means of predicting how a series of graphs will evolve over time; the domain is graph editing. The authors indicate GENs can be used for several things, such as a solution to the GED (graph edit distance) problem, and they show GENs in experimental environments with encouraging results for predicting node insertions and edge operations. The manuscript shows that, given a graph matching between a pair of graphs, there is an algorithmically determinable near-optimal graph edit sequence for generating training data. Finally, the paper demonstrates the model's capabilities on a set of synthetic benchmarks. The manuscript is well written, and this paper is likely of interest to the ICLR forum; broadly speaking, it would be suitable for poster presentation. ### Summary:
Reviewers agreed that this work should be accepted. They praised it as a thorough reproduction that was well written, with good-quality evaluations.
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: The authors of this paper study the popular belief that deep neural networks perform information compression in supervised tasks. They study this compression behavior with tanh and with relu and its variants, activation functions that are saturating and non-saturating in nature, respectively. The compression score is computed using mutual information estimation; since the mutual information is usually infinite for deterministic networks, noise can be added to the hidden activations to obtain finite values. For this purpose two approaches were explored, namely entropy-based binning (EBAB) and adaptive kernel density estimation (AKDE): EBAB adds noise to the hidden activations by binning, and AKDE by Gaussian noise. Their results show that both EBAB and AKDE exhibit compression in the case of relu, although this behavior is strongest with tanh. Finally, when the compression score was plotted against accuracy, higher rates of compression did not show a significant correlation with generalization, giving evidence that generalization (or good performance) can be achieved even without an information bottleneck / information compression.

Qualms:
1. Figure 7's claim that elu, swish, and centered softplus functions perform compression is not very apparent.
2. In figure 9b, the regression line between compression score and accuracy shows a positive correlation between them; this seems contradictory to the inference drawn.
3. The experiments were done on a 5-layer network with 10, 7, 5, 4, and 3 nodes respectively, on toy data of 12-bit binary vectors. The study could have included bigger networks with popular datasets, which would give substantial support to the trend observed on toy data.

This paper proposes a method for the estimation of mutual information for networks with unbounded activation functions, and the use of L2 regularization to induce more compression. The use of information planes to study the training behavior of networks is not new. This paper addresses the issue of unbounded hidden-state activities, since the differential mutual information in a DNN is ill-defined; the authors propose to add noise to the hidden activity via a binning process. It is not clear in the paper whether the binning is applied just for visualizing the information plane or for computing the activities of hidden units in upper layers; if it is the latter, it creates unnecessary distortions in the DNN. As the authors point out, different initializations can lead to different behaviour on the information plane, so it would be difficult to draw conclusions based on the experimental results, even though they come from the average of 50 individual networks. Also, the experiments are performed on a particular task; it is not certain that similar behavior would be observed in other tasks. It is, however, more important to understand what causes the compression; for L2 regularization the compression is expected, as the regularization tends to limit the values of the weights.

This paper has 3 principal contributions: it proposes a different way of measuring mutual information in a neural network, proposes a compression score to compare different models, and then empirically analyses different activation functions and L2 weights. This work seems like a welcome addition to the IB thread. To me the most interesting result is simply that activation functions aren't just about gradient flow, and that they may each have properties that are more or less desirable depending on the domain they are used in. The authors are careful in the wording of their conclusions, I think with reason: while these results are useful in that there seem to be consistently different behaviors coming from different hyperparameters, information planes show a relatively qualitative part of the picture. Quantitatively, the proposed compression score is interesting but, as the authors say, simplistic. It seems to me that we care more about the converged models than the whole training trajectory; how does this score evolve with time? I think an important part of discussion lacking in this paper is a more in-depth take on how these findings relate to the Zhang et al. [1] memorization-vs-generalization paper and its follow-ups; there seem to be many links to be drawn. This work is overall a good contribution, but I have to agree with the authors' conclusion that more principled analysis methods are required to get a solid grasp of the training dynamics of DNNs. The writing of the paper is good, but the writing of the captions could be improved; the hard page limit of ICLR is 10 pages and your paper has a lot of captions, so I think investing in a bit more text would be good.

Comments:
- It might be worth re-explaining what the information-plane plots are in a figure caption, not just in the text. The text also doesn't really explain that each point is a moment in training and each thread a different layer; this paper should be readable by someone who has never seen these plots before.
- It's not clear what is going on in figure 5; I can guess, but again this paper should be readable by anyone in the field. You mention different initializations, but which exactly? What makes you say that 5c shows no compression but that 5a does? This should be explained explicitly first.
- I believe what you say about figure 8, but the plots are so similar that it is hard to compare them visually; maybe a different kind of superposition into a single plot would better illustrate the compression effect of L2.
- Typo in the x-axis caption of figures 9.
- Figure 9a is not readable in greyscale or by a colorblind person; consider using a different symbol for the softmax scatter and adding this symbol to the legend.
- The first Schmidhuber citation of the paper seems a bit out of place; I think he himself would say that deep learning has been going on for much longer than since 2015. In fact, I think you could just remove the entire first paragraph; it is unnecessary boilerplate.
- Why should there be a direct correlation between compression and generalization? For example, it is known that training DNNs with soft targets improves test accuracy in classification, and even forcing softness in both targets and representations [2] also improves test accuracy.
- I'm still personally not sold on binning as a strategy to evaluate MI. Did you perform experiments showing that the observed difference is consistent when more computation is done to approximate MI, and is not just an artefact of max-entropy binning?

[1] Zhang et al., 2016, https://arxiv.org/abs/1611.03530
[2] Verma et al., 2018, https://arxiv.org/abs/1806.05236 ### Summary:
This paper suggests that noise-regularized estimators of mutual information in deep neural networks should be adaptive, in the sense that the variance of the regularization noise should be proportional to the range of the hidden activity. Two adaptive estimators are proposed: (1) an entropy-based adaptive binning (EBAB) estimator that chooses the bin boundaries such that each bin contains the same number of unique observed activation levels, and (2) an adaptive kernel density estimator (AKDE) that adds isotropic Gaussian noise whose variance is proportional to the maximum activity value in a given layer. These estimators are then used to show that (1) relu networks can compress, but compression may or may not occur depending on the specific weight initialization; (2) different non-saturating nonlinearities exhibit different information-plane behaviors over the course of training; and (3) L2 regularization in relu networks encourages compression. The paper also finds that only compression in the last softmax layer correlates with generalization performance. The reviewers liked the range of experiments and found the observations in the paper interesting, but had reservations about the lack of rigor (no theoretical analysis of the convergence of the proposed estimators), were worried that post-hoc addition of noise distorts the function of the network, and felt that there wasn't much insight provided on the cause of compression in deep neural networks. The AC shares these concerns and considers them to be more significant than the reviewers do, but doesn't wish to override the reviewers' recommendation that the paper be accepted.
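As a rough illustration of the two adaptive estimators described above, the sketch below shows what adaptive binning and adaptive noise could look like in practice. It is an illustrative reconstruction only, not the authors' code: the function names, the n_bins and noise_scale hyperparameters, and the assumption that I(X;T) reduces to the discrete entropy H(T) for a deterministic layer (each input producing a distinct activation pattern) are my own.

```python
import numpy as np

def ebab_edges(activations, n_bins=30):
    """Entropy-based adaptive binning (EBAB): place bin edges so that each bin
    covers roughly the same number of *unique* observed activation levels,
    rather than an equal slice of the raw activation range."""
    levels = np.unique(activations)  # distinct values seen in the layer
    edges = np.quantile(levels, np.linspace(0.0, 1.0, n_bins + 1))
    return np.unique(edges)          # drop duplicate edges if few unique levels

def discrete_entropy(codes):
    """Shannon entropy (nats) of discrete codes, one row per input sample."""
    _, counts = np.unique(codes, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

def mi_input_layer_ebab(layer_T, n_bins=30):
    """Estimate I(X;T) for a deterministic layer via EBAB discretization.
    Assumes each input yields a distinct activation pattern, so I(X;T) = H(T)."""
    edges = ebab_edges(layer_T, n_bins)
    codes = np.digitize(layer_T, edges[1:-1])  # bin index per hidden unit
    return discrete_entropy(codes)

def akde_noisy_activity(layer_T, noise_scale=0.1, rng=None):
    """AKDE-style regularization: add isotropic Gaussian noise whose standard
    deviation scales with the largest activation magnitude in the layer, so the
    noise level adapts to unbounded (e.g. relu) activities before a KDE-based
    mutual-information bound is computed on the noisy activity."""
    rng = np.random.default_rng() if rng is None else rng
    sigma = noise_scale * float(np.abs(layer_T).max())
    return layer_T + rng.normal(0.0, sigma, size=layer_T.shape), sigma
```

Under these assumptions, compression would show up as the estimated H(T) of a given layer decreasing over the course of training.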
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the paper looks at online matrix completion. the goal is to obtain a matrix x in R^{d x r} that minimizes ||x x^T - m||_F^2 when the elements of the d x d matrix m are revealed in an online fashion. the paper looks at improving the convergence guarantees of sgd for this task in terms of the dependence on the condition number kappa of the ground-truth m.

a previous result of [1] showed a convergence guarantee of O(kappa^4 d r log(d/epsilon)) for the regular sgd update. this result shows that multiplying the stochastic gradient with a preconditioner p = (x^T x)^{-1} gives the aforementioned convergence guarantee without any dependence on kappa.

the paper shows that this preconditioner can be maintained by making 4 calls to the sherman-morrison formula, and incurs an additional cost of O(r^2) as compared to O(r) in regular sgd.

the main contribution in the analysis is a descent lemma arguing that the coherence of the iterates remains O(1) throughout.

[1] chi jin, sham m. kakade, and praneeth netrapalli. provable efficient online matrix completion via nonconvex stochastic gradient descent. advances in neural information processing systems 29, 2016.

sgd for online matrix completion seems to be well motivated [1], and applying a preconditioner in the descent step is a natural idea that also seems to be well studied for full gd [2, 3]. hence this idea is natural, but the main contribution seems to be in the analysis, specifically in lemma 4, which argues that the coherence of the iterates decreases at a linear rate independent of kappa. this analysis seems to be an important contribution to understanding preconditioned stochastic optimization for rmse matrix completion.

weakness: as you have mentioned, it seems like the quality of approximation required for the initial iterate x_0 contains a factor of 1/d that [1] doesn't; please see the questions section.

[2] jialun zhang, salar fattahi, and richard y. zhang. preconditioned gradient descent for overparameterized nonconvex matrix factorization. advances in neural information processing systems, 34:5985-5996, 2021.
[3] tian tong, cong ma, and yuejie chi. accelerating ill-conditioned low-rank matrix estimation via scaled gradient descent. journal of machine learning research, 22(150):1-63, 2021.

yes

docsep

the matrix completion is a classic type of method that tries to recover the ground truth from the observed elements. stochastic gradient descent (sgd) is one of the solutions to huge-scale matrix completion in the real world; however, it suffers from an ill-conditioned ground truth. this work proposes an sgd-based method that is not affected by the ill-conditioned ground truth and meanwhile achieves a per-iteration complexity improvement.

strengths: the sgd version in this problem is missing and this work is trying to fill the missing piece. this work is a nice extension of previous work, including the full-gradient methods scaledgd and precgd and the matrix-completion sgd results (symmetric psd case) from jin [40]. the main contribution of this work could be eliminating the kappa-dependency in the complexity. the experiment in the paper is on synthetic data, but they offer real data in the supplementary. it's a solid improvement on per-iteration complexity.

weaknesses: it's not actually a weakness. the main contribution is about eliminating the condition number kappa in the complexity, and it reads like the previous work did not consider that at all. however, the previous work scaledgd did discuss the case where kappa is independent. it's fine, but probably fairer to mention that to make it clearer.

no potential negative societal impact.

docsep

this paper proposes a new online algorithm for matrix completion when the underlying low-rank matrix is ill-conditioned (has a large condition number). the authors deliver improved theoretical guarantees for the local convergence of the algorithm.

strengths: this paper makes a solid contribution to online matrix completion when the ground truth is ill-conditioned. the authors present a new algorithm that provably handles an ill-conditioned low-rank matrix, with theoretical guarantees. extensive numerical experiments are conducted to illustrate the theory.

weaknesses: my concerns for this paper are mainly about its novelty, as discussed in the first point below.

1. given that prior literature [1] already showed the fast convergence of sgd for online matrix completion with a well-conditioned ground truth, and a series of recent work [2, 3, 4] showed that scaling provably improves the performance of gradient descent in the presence of an ill-conditioned ground truth, it is not surprising to see that incorporating the scaling scheme into sgd will improve the performance for online matrix completion with an ill-conditioned ground truth. what is the technical novelty of this paper? i recommend the authors highlight their technical contributions.

2. the theory in this paper only guarantees local convergence. if i am not missing anything, this paper does not discuss how to obtain an initialization that satisfies the conditions in theorem 2. the most popular initialization scheme is the spectral method; under what conditions does the spectral method (or another method) satisfy the conditions in theorem 2, and how do these conditions depend on the condition number? if we don't have a good initialization scheme when the ground truth is ill-conditioned, then the local convergence result in this paper alone will not be so useful.

3. the claim that ...

4. the writing quality of this paper needs to be improved and the paper needs to be proofread. for example, only on page 4 i see the following problems: (a) the equation for sgd contains multiple typos; (b) the sentence "many iterations of sgd should concentrate about the behavior of full-batch gradient descent" is ungrammatical; (c) on line 140, a cubic factor of kappa^3 is actually kappa^9; (d) on line 144, "initial initialization" should be "initialization" or "initial point"; (e) on line 145, "with at least xxx probability" should be "with probability at least xxx".

[1] jin, c., kakade, s. m., netrapalli, p. (2016). provable efficient online matrix completion via nonconvex stochastic gradient descent. advances in neural information processing systems 29.
[2] tong, t., ma, c., chi, y. (2021). accelerating ill-conditioned low-rank matrix estimation via scaled gradient descent. journal of machine learning research, 22(150):1-63.
[3] tong, t., ma, c., chi, y. (2021). low-rank matrix recovery with scaled subgradient methods: fast and robust convergence without the condition number. ieee transactions on signal processing, 69:2396-2409.
[4] tong, t., ma, c., prater-bennette, a., tripp, e., chi, y. (2022, may). scaling and scalability: provable nonconvex low-rank tensor completion. in international conference on artificial intelligence and statistics (pp. 2607-2617). pmlr.

after rebuttal: i would like to thank the authors for addressing my comments. i have updated the score accordingly. this is a theory paper and i believe there is no potential negative societal impact.
### Summary:
the sgd version for this matrix completion problem is missing and this work is trying to fill the missing piece. the concerns of the reviewers seem to be resolved. please try to add the real-data experiments to the final version, maybe instead of the synthetic data.
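To make the preconditioned update discussed in the reviews above concrete, here is a minimal NumPy sketch of one stochastic step on a single observed entry M[i, j], with the preconditioner P = (X^T X)^{-1} refreshed by Sherman-Morrison updates. The function names, step-size handling, and the omission of the unbiasedness rescaling and of diagonal entries are simplifications of my own, not the paper's exact algorithm.

```python
import numpy as np

def sherman_morrison(P, u, sign):
    """Update P = A^{-1} after the rank-one change A <- A + sign * u u^T."""
    Pu = P @ u
    return P - sign * np.outer(Pu, Pu) / (1.0 + sign * (u @ Pu))

def precond_sgd_step(X, P, i, j, m_ij, lr):
    """One preconditioned SGD step on the observed entry M[i, j] (i != j assumed).

    X is the d x r factor, P approximates (X^T X)^{-1}. Only rows i and j of X
    change, so X^T X changes by four rank-one terms and P can be refreshed with
    four Sherman-Morrison calls at O(r^2) cost instead of a full re-inversion.
    """
    xi, xj = X[i].copy(), X[j].copy()
    resid = xi @ xj - m_ij                 # residual of the (i, j) entry of X X^T
    X[i] = xi - lr * resid * (xj @ P)      # gradient w.r.t. row i, scaled by P
    X[j] = xj - lr * resid * (xi @ P)      # gradient w.r.t. row j, scaled by P
    for u, sign in ((xi, -1.0), (xj, -1.0), (X[i], +1.0), (X[j], +1.0)):
        P = sherman_morrison(P, u, sign)   # remove old rows, add updated rows
    return X, P
```

A full pass in the symmetric PSD setting described by the reviewers would apply this step to each entry revealed online; the plain SGD baseline is recovered by replacing the `@ P` products with the raw gradient.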
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper addresses the problem of answering queries about foils, i.e., why an alternative plan was not chosen by an agent acting optimally in a deterministic mdp. the authors describe three broad classes of responses to this query: (1) one of the actions in the foil does not satisfy the preconditions, (2) the foil does not achieve the goal, or (3) the foil has suboptimal cost. importantly, all responses are conducted with respect to a prespecified set of concepts (predicates, or binary classifiers). the symbolic preconditions and costs are learned by interaction with the simulator but expressed in the language of these concepts. the authors discuss extensions to their basic framework that (1) provide confidence measures along with the responses and (2) handle noisy/probabilistic concepts.

a strength of this paper, in my opinion, is that it addresses a very important problem and seems to make a lot of positive strides toward a solution. i found the motivating paragraphs in the introduction to be highly compelling and the related work section to be sufficient for placing this paper with respect to related literature on explainability, which i am not very familiar with. another plus point is that the authors show that their system is robust to uncertainty (via the confidence measures) and noise (via the probabilistic concepts).

however, my main issue with this paper surrounds the assumption of the prespecified concepts. the most compelling motivational sentence to me was "more often than not ... lay user". however, if we are assuming that lay users are the target users of the proposed system, where would these concepts come from? i understand that they could be specified ahead of time, but then there would be issues if the user has a concept in mind while specifying a foil that is not covered by the prespecified set. the authors do partially address this concern in the sentences "an empty list ... task-related concepts", but i did not find this sentence compelling on its own, and it seemed to me that the empirical studies do not consider this possibility of needing to add more concepts to be able to answer a query (please do correct me if i have missed this).

to me, one of the most interesting aspects of this problem is to consider how to automatically learn or improve the initial set of concepts, perhaps given data of many users' foils across different problem instances in a particular domain. for instance, if we notice that users often provide a foil that involves walking into the skull (we can figure this out with our simulator), we may be able to learn that a concept "next-to-skull" is important when building our symbolic preconditions. unfortunately, the current paper does not focus enough on this important aspect of the problem, instead simply assuming a good set of concepts to be prespecified, which i find to be highly unrealistic.

a second major issue is the relatively low quality of the empirical results. i appreciate that in this line of work it can be hard to provide rigorous numbers since much of the evaluations come from humans. nevertheless, i believe that one can obtain much more illuminating results in this problem setting by improving the experimental design and reporting. a simple starting point would be to include confidence measures on the reported numbers in the paragraphs discussing the results of h1, h2, and h3. beyond that, it would be good to probe deeper, e.g. for h3, cluster and show us visualizations of the different situations where the precondition-based explanations were useful while the saliency-based explanations were not, and vice versa, so that we can better understand when each one tends to be better. personally, i could imagine many situations (e.g. in understanding the behavior of an autonomous vehicle) where i would rather just have the saliency map that highlights a region of my surroundings instead of being given a written explanation as to why the vehicle performed a certain maneuver. so i think the paper could be greatly improved if the authors probe more deeply into explaining the empirical results. by the way, much of the work by anca dragan (i am unaffiliated; see e.g. [1]) can be a useful reference on how to set up highly rigorous user studies.

looking forward, i have one question for the authors. it is clear that in real-world applications of ai our agents will never be able to act optimally, and we shouldn't expect them to. given this, would there be a way to modify the current work to create feedback from the user that allows the agent to improve its solution? for instance, perhaps the agent is in the middle of executing the best solution it found after thinking for 30 minutes, but then the user says "why not do x instead", and the agent has to decide between (a) the user missed something and i should generate an explanation, or (b) i missed something and i should think more and revise my current policy.

[1] bobu, andreea, et al. learning under misspecified objective spaces. conference on robot learning, 2018.

docsep

this paper presents a novel approach to generate contrastive explanations in a dialogue setting between a human and a planning agent. the setting assumes that the agent generates and offers an optimal plan to the user, and the user in turn challenges the presented plan, offering an alternative, i.e., a contrast/foil. the goal of the agent is to denounce the alternative plan by explaining the infeasibility or suboptimality of the plan to the user in concepts they understand.

the explored direction is interesting and relevant, as it seems to be a natural addition to the related problem of generating contrastive (aka counterfactual) explanations. i would suggest, however, that the paper more clearly distinguishes between the contrastive explanation generation literature (see e.g. a survey [1]) and the type of explanation which is offered here, which is to identify the minimal set of preconditions in an alternative concept space that describes/explains the difference between two given instances, i.e., a model-proposed fact and a human-generated foil. this motivation is related to such missing related work as [2].

strengths: the writing throughout is well polished and the motivation in the abstract and introduction is very well done. the formulations are sensible and seem to be encapsulating the settings described, and generalizations. the inclusion of a user study is helpful.

suggestions: the overloading of notation is at times difficult to follow. as a reader not familiar with montezuma, foils, or sokoban, i had a difficult time understanding the experimental section. a comparison with optimal explanations is lacking; relatedly, statements such as "the search [was] able to identify the expected explanation" are misleading, as without an infinite budget and exhaustive search the algorithm can identify an approximation to the optimal explanation.

re user study: i am not entirely convinced that the baselines are fair. by default, i would expect that offering more non-fooling information would render a higher subjective explainability score. perhaps the experiments would be stronger if tested against other types of information that is provided in addition to the presented baseline, something along the lines of h3 (although even here it is not a completely fair comparison, because unlike saliency maps, the offered explanations depend on human-annotated concepts, which would naturally render the presented explanations more human-understandable).

nit: a completeness score of 3.36/5 is not possible; this would mean one of the 20 participants voted a non-integer value.

in the related work there seems to be an absence of literature on planning; perhaps a differentiation of the scope of the presented paper with this literature would boost the motivation.

summary: i think this paper explores a fresh direction which is necessary for enabling humans to contest, challenge, and ultimately trust the decisions of an automated system. i also believe that the presented material still requires a lot of further work and attention, and would be happy to see it accepted so the community can further explore these directions (fair user studies, the relation between approximate and optimal plans, investigations into the properties of foils relative to optimal actions, etc.). if accepted, i would strongly suggest a better integration of the main body and appendix, especially for sec 3.5.

[1] karimi et al. https://arxiv.org/abs/2010.04050
[2] goyal et al. https://arxiv.org/abs/1904.07451

docsep

edit: i have read the other reviews as well as all author responses. the other reviewers noted meaningful concerns, but i believe the authors have clearly addressed most of these points. i still believe this work is an accept.

summary: the authors present a system for producing explanations in terms of user-specified prerequisite relationships. the authors train a classifier to detect the presence of user-specified concepts; relationships between these concepts are found by learning a partial symbolic model. with this model, an agent's action can be compared to a user-provided alternative, and the first prerequisite-violating action can be identified, or the cost difference can be conveyed. the authors evaluate their approach on two domains with a user study.

pros: this work addresses a relevant problem in explaining rl and other sdm agents, and provides an initial solution for a meaningful set of domains. i agree with the authors that, to my knowledge, this is the first work to provide explanations in terms of a learned symbolic model separate from the one used by the agent. the method for identifying failed preconditions is clearly introduced and motivated. the extension to handle noisy classifiers substantially increases the usefulness of this work.

cons: this work requires a deterministic domain as well as access to a simulator from which states can be sampled; this substantially limits the applicability of this approach, and this limitation is not mentioned in the abstract. the authors make a number of assumptions (section 4, confidence over explanations), but these are not quantitatively evaluated; adding experiments that measure how well these assumptions hold in practice would improve this work. the user study could be improved in a number of ways: it used manual translation of explanations to text, and a comparison was made to saliency maps but not to other explanations such as causal explanations.

questions (during rebuttal period, please address and clarify the cons above): how important was the change to montezuma's revenge rendering failed actions — is this a general requirement for creating explanations in an environment? do you have additional information about the participants: were they ai practitioners, and were any of the main study participants among those who selected the 2538 concepts?

other comments: the authors may be interested in "distal explanations for explainable reinforcement learning agents", a follow-up to the madumal et al. 2020 work cited in section 6. section 5 appears to be broken into fewer subsections (has fewer line breaks) in order to meet page requirements; this leads to a cramped, less organized section 5. some typos etc.: in general, another editing pass for articles and agreement of subject-verb plurality would help this paper; line 2 of introduction: "they ... with its" should change to "with their"; paragraph 3 of introduction: the "the" after "questions of this form" is not matched; background: "we will consider goal-directed agents that any given point is trying to drive" — fix verb agreement and remove extra words; the sentence leading into definition 4 does not smoothly lead to definition 4; section 4: "so we start with" — remove "so"; section 4: "we aren't be able" — remove "be"; section 5: the adaboost citation is using the wrong command.

docsep

the authors propose a method of explainable ai for inscrutable blackbox models. the explanations build on a set of user-defined primitives independently trained on the blackbox representation (e.g. visual frames of an atari game), and use an increasingly popular method of providing contrastive explanations. two forms of foil-based responses are provided: (1) indication of action failure from the planning perspective (preconditions unsatisfied), and (2) an explanation of relative suboptimality that highlights key aspects of action costs that the user may be unaware of.

high-level concepts, particularly those tied to a symbolic description of the world dynamics, are an extremely compelling basis for explanation: it helps build a well-grounded intuition with human users/observers of autonomous systems, and arguably is the best way to convey explanations that describe behaviour of an inherently sequential nature. in addition to the form of explanation primitives, the algorithms are intuitive and the probabilistic inference seems to be sound.

my concerns with the paper fall into two main categories: the lack of substantial contributions (particularly as related to representation learning) and the strong assumptions placed on the setting. one of the most significant missed opportunities in this work is to focus on introducing new concepts, especially given that human studies were conducted and the setting was identified (when algorithms fail and new or revised concepts are required). assuming highly accurate binary classifiers for each concept is relatively extreme, and it's only one such overly strong assumption. other very strong assumptions include: (a) the state is memoryless/markovian — every concept can be determined by looking exclusively at the current frame; this isn't the case in many settings where some memory of the previous actions is required. (b) the distribution of a fluent across the state space is independent of the distribution of other fluents; this is rarely the case in planning-like domains, and the types of explanations introduced in this work build on planning-like domains a great deal. (c) there is only one failed precondition; this might be an alright assumption to make given (b), but similarly i find it unlikely that many domains would have this property. as pointed out by the authors, assumptions (b) and (c) cause algorithm 2 to exhaust the entire sampling budget before failing, and i don't believe they are safe assumptions to make.

on the topic of evaluation, there are two further issues. one is the scope of the evaluation (only two domains and a seemingly small number of subjects to test with), and this reduces the significance of the paper's contribution. another issue is the choice of comparison for h3: the saliency map is built using different information than that surfaced using the proposed approaches. this makes it challenging to adopt the experiment's conclusion as written, since it is also testing the quality of the saliency map in addition to the comparative nature of the explanations.

i am leaning towards rejecting the paper due to the number of assumptions placed on the approach combined with the limited evaluation setting and scope of work; the contributions to the field of learning representations seem limited. ultimately, my hesitation in recommending acceptance comes from the contribution being on the low side for the iclr community. the authors identify key elements that would change this impression (e.g. refining the concepts when the algorithms fail to find an explanation — italics on pg 5), but these do not play a central role in the proposed work. the h1/h2 results are in some sense evident: that the proposed explanations are preferred (19/20 for h1), while an important element, it would be very surprising if this weren't the case; they don't serve as a sufficient contribution in their own right. the h3 comparison seems to be somewhat contrived, since they come from different sources — a more accurate comparison would be to engineer the saliency overlay based on the domain knowledge known, i.e., reflecting the precondition-based information directly. without that, you conflate both the choice of focus and the ability to highlight that choice.

questions for the authors: (1) how do you remove the seemingly strong assumption that the distribution of fluents across the state space is independent among the fluents? alternatively, why can we expect this to be a reasonable assumption to make? (2) how would you remove the dependence on the fully observable (markovian) assumption on the blackbox output that is used for concept classification, i.e., when the full state cannot be discerned by looking at the screen alone?

other minor points of improvement for the paper:
o mind the notation used for your goal set; near the end of page 2 you are using a different syntax than the one introduced.
o defn 3 seems to have a random bracket at the end.

docsep

summary: this paper introduces a new method for contrastive explanation of symbolic models on sequential decision-making problems, i.e., explaining why a foil plan is not as good as the system-proposed plan. the suboptimality of foil plans is categorized into two types: invalid actions and larger cost. to make the explanation more understandable to humans, the author introduces concepts into the explanation, which are assumed propositional properties of states; concept classifiers can be trained on samples to predict the presence of each concept in a state. to explain an invalid action, the method reports a missing precondition concept of the failing action; to explain a larger cost, it reports a set of representative concepts such that the foil actions under these concepts are guaranteed to give a larger cost than the proposed plan. the paper also provides detailed algorithms to find preconditions or representative concepts using abstract cost functions for actions. the authors also introduce simple pgms to evaluate the confidence of each explanation. the authors conducted user studies on the proposed explanation method and demonstrated its usefulness.

strength: model explanation is indeed a very important field today for any ml models. sequential decision-making models are popular and widely used, but explanation methods for them are still somewhat underexplored. as reported in the paper, previous work either includes concepts directly in model learning (not post hoc) or uses saliency maps (not human concepts); this work introduces human concepts to post-hoc explanation. the whole explanation method is novel and makes sense to me; the ideas are intuitive, but the algorithms for finding the explanations and evaluating the confidence are nontrivial.

weakness: in my opinion, the user study still has room for improvement. for h1 and h2, directly asking the user which one is more useful does not look like a good way of comparison, at least to me: it's hard to tell which one is more useful if i have already read both and understood. maybe it's better to only show one explanation to a participant and ask how well they understood, or let them perform a task as in h3. for h1, as i see in figure 8b, the explanation with concepts (ce) has exactly more information than the baseline explanation (be); in other words, ce is concept + image while be is only image — be is a strict subset of ce — and such a comparison does not seem very useful. would it be better if you, for example, compare concept-only with image-only, such as removing "in the state shown in the image" from ce? for h3, the task setting is interesting, but in the shown example the concept that the user has to learn is merely to use the switch, which looks somewhat too simple and not very interesting; it would be more attractive if the concept is more complex, e.g. you can't fall off a plane or touch an enemy as in montezuma. i think the participants' background information (age, gender, how recruited, etc.) is important for a user study, and i suggest having it in the main paper, or at least making it easier to notice. it seems that empirical studies are not provided for confidence score calculation, i.e., how well do the scores correspond with actual explanation accuracy?

questions: in section 4, line 16, you mentioned "in cases where we are guaranteed that the concept list is exhaustive" — could you provide some examples of these cases? in section 5 (explanation identification), you mentioned "the search was able to identify the expected explanation for each foil" — does that mean all the output explanations are accurate? if so, it will be interesting to see how the search performs on more complex tasks. for explanations on larger cost, as in figure 9b, to me the most ideal explanation would be just "executing action push up when box on pink cell costs at least 10"; other lines are not useful. in other words, it might be better to focus on steps where the foil has larger cost than the proposed plan. do you have any thoughts on this line?

typos: section 2, symbol g (set of goal states): font inconsistent; appendix a3, algorithm 2, line 12: symbol broken.
### Summary:
this paper is an intriguing study of agents that can give explanations (contrastive) of their actions via symbolic representation, such as dialog; agents can also allow users to argue against the agents' decisions. i am extremely impressed by the quality of the reviewer comments and discussions. it is also interesting that the reviewers have formed two camps of thought on the paper: one camp consists of r3 and r5, who are in agreement in vociferously criticizing the weak points in the paper; the other camp consists of r1, r2, and r4, who champion the merits of what they see as strong points. notably, all reviewers have fairly high confidence values (only one confidence score of 3, and all others are 4). it was a borderline case and not an easy decision. in the end, the program committee decided that the paper in its current form does not quite meet the bar and would benefit from another revision (see e.g. r4 comments). we think that the work is interesting and encourage the authors to address the reviewers' comments and resubmit the work to another venue.
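As an illustration of the "invalid action" response type the reviews above describe (replaying a user's foil in the simulator and checking learned, concept-level preconditions), here is a small hypothetical sketch. The function signature, the `next_to_skull` example concept, and the handling of an unexplained failure are invented for illustration and do not come from the paper itself.

```python
from typing import Callable, Dict, List, Optional, Tuple

State = object                               # e.g. a game frame; left abstract here
ConceptClassifier = Callable[[State], bool]  # learned binary classifier per concept

def first_failing_precondition(
    foil: List[str],
    start: State,
    step: Callable[[State, str], Optional[State]],  # simulator; None if action fails
    preconditions: Dict[str, List[str]],            # learned: action -> required concepts
    classifiers: Dict[str, ConceptClassifier],
) -> Optional[Tuple[str, str]]:
    """Replay the user's foil and report the first action whose learned
    precondition concept does not hold, e.g. ("move_left", "next_to_skull")."""
    state = start
    for action in foil:
        for concept in preconditions.get(action, []):
            if not classifiers[concept](state):
                return action, concept
        nxt = step(state, action)
        if nxt is None:
            # The simulator rejects the action but no known concept explains it:
            # a signal that the concept vocabulary may need extending.
            return action, ""
        state = nxt
    return None  # the whole foil executes; a cost-based explanation is needed instead
```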
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 25442, 18164, 4242, 5231, 310, 436, 247, 2087, 8284, 323, 6153, 22909, 275, 271, 3126, 50276, 3088, 368, 452, 3081, 1491, 670, 253, 5014, 497, 597, 23105, 24432, 497, 667, 273, 253, 2022, 1263, 5014, 2190, 1110, 665, 4236, 253, 2030, 1839, 12342, 50275, 977, 5701, 50276, 783, 4477, 778, 320, 6110, 275, 16891, 22909, 323, 5513, 494, 35221, 4715, 6083, 247, 956, 484, 281, 253, 10279, 360, 267, 1162, 355, 9169, 789, 11106, 275, 2593, 721, 50275, 4674, 608, 4620, 281, 320, 7154, 715, 11184, 749, 21454, 50276, 7110, 11184, 1386, 13471, 275, 1340, 281, 2525, 3239, 6095, 436, 5644, 281, 247, 1531, 17263, 1679, 10932, 2593, 608, 50276, 8826, 963, 993, 14069, 50276, 249, 2087, 1529, 14835, 1509, 323, 7774, 285, 4345, 273, 2256, 25340, 11234, 651, 1361, 436, 2929, 50275, 1282, 374, 273, 10199, 597, 50276, 3113, 697, 1818, 281, 342, 616, 50275, 43575, 495, 273, 10199, 253, 50276, 6438, 3533, 273, 436, 830, 310, 417, 13373, 50275, 11814, 359, 588, 1908, 4736, 27481, 6083, 326, 667, 1677, 1127, 310, 2820, 281, 4446, 4993, 17257, 4345, 285, 5386, 4465, 3000, 50275, 783, 6197, 4283, 715, 5426, 577, 1057, 417, 25863, 1421, 281, 5426, 577, 50275, 4674, 577, 594, 359, 1265, 342, 5386, 594, 50276, 4674, 577, 359, 403, 2649, 320, 2104, 5386, 320, 50275, 4674, 608, 519, 40826, 493, 25577, 310, 970, 3430, 3923, 50276, 7152, 33032, 253, 4477, 12661, 247, 1332, 273, 5513, 494, 23105, 323, 275, 8658, 13508, 2806, 3364, 3210, 253, 22909, 1973, 327, 247, 873, 273, 2608, 7769, 2248, 23223, 10939, 10166, 327, 253, 2806, 3364, 6779, 24088, 5304, 13009, 273, 271, 387, 1792, 2165, 285, 897, 271, 9592, 4633, 1332, 273, 5277, 4499, 422, 22909, 767, 4948, 273, 27204, 3169, 6128, 403, 2530, 337, 14011, 273, 2250, 4433, 432, 253, 7219, 8668, 638, 32978, 5061, 33496, 285, 374, 271, 8813, 273, 4103, 749, 32581, 1319, 326, 16681, 2234, 7794, 273, 2250, 4815, 326, 253, 2608, 778, 320, 25229, 273, 50276, 8656, 5251, 12342, 3782, 1110, 12331, 281, 247, 24762, 5740, 273, 253, 1533, 8062, 310, 271, 6685, 18511, 3720, 323, 8813, 352, 7729, 1973, 247, 973, 2595, 264, 30328, 342, 1966, 4212, 706, 2152, 735, 273, 26279, 2718, 285, 25711, 310, 253, 1682, 1039, 281, 12709, 22909, 326, 6266, 8770, 273, 271, 26557, 22453, 3753, 50276, 249, 1635, 281, 253, 830, 273, 8813, 2248, 23223, 253, 11333, 403, 27350, 285, 253, 37851, 17032, 3133, 281, 320, 3590, 50276, 2577, 7350, 342, 253, 2929, 2965, 715, 767, 2022, 9050, 253, 3480, 273, 6832, 9021, 3782, 347, 2905, 281, 6779, 4715, 285, 253, 2266, 13260, 4845, 327, 253, 4758, 50276, 531, 273, 253, 954, 1534, 9829, 9091, 275, 436, 789, 310, 281, 2770, 327, 16984, 747, 12342, 3340, 1677, 326, 1966, 2175, 497, 5196, 285, 253, 4758, 369, 3636, 672, 11333, 1891, 285, 747, 390, 17265, 12342, 403, 2424, 7384, 4122, 7899, 8985, 49996, 323, 1016, 4473, 310, 4942, 9559, 285, 697, 760, 581, 824, 27662, 2266, 9376, 50276, 977, 1077, 2266, 13260, 2486, 50276, 66, 253, 1375, 310, 3541, 1417, 4698, 729, 757, 1046, 4473, 476, 320, 3413, 407, 2819, 14288, 387, 253, 1655, 3665, 436, 310, 2649, 253, 1083, 275, 1142, 7533, 835, 690, 3541, 273, 253, 2045, 5231, 310, 2424, 50276, 67, 253, 3268, 273, 247, 2938, 290, 2439, 253, 1375, 2317, 310, 3907, 281, 253, 3268, 273, 643, 2938, 592, 436, 310, 11766, 253, 1083, 275, 7219, 3022, 10625, 285, 253, 3510, 273, 22909, 5611, 275, 436, 789, 1973, 327, 7219, 3022, 10625, 247, 1270, 2968, 50276, 68, 627, 310, 760, 581, 4242, 638, 12380, 436, 1537, 320, 271, 34557, 9376, 281, 1056, 1677, 270, 533, 12014, 891, 1089, 352, 11543, 326, 1142, 10625, 651, 452, 436, 2867, 50276, 284, 
8042, 562, 407, 253, 4477, 13260, 270, 285, 260, 2847, 5933, 374, 281, 9286, 253, 2862, 10491, 7563, 1078, 11741, 285, 891, 13414, 2868, 597, 403, 4999, 13260, 281, 1056, 50276, 251, 253, 9400, 273, 7103, 627, 403, 767, 2007, 3374, 581, 310, 253, 7990, 273, 253, 7103, 760, 767, 10625, 285, 247, 16907, 1355, 1180, 273, 5705, 281, 1071, 342, 285, 436, 11355, 253, 8453, 273, 253, 9380, 7680, 1529, 2523, 310, 253, 4327, 273, 5301, 323, 288, 20, 253, 3779, 4364, 3711, 310, 4270, 970, 1027, 1491, 685, 326, 47300, 970, 253, 4081, 7274, 436, 2789, 352, 11132, 281, 5283, 253, 4679, 6452, 347, 3542, 1580, 352, 310, 671, 5175, 253, 3290, 273, 253, 3779, 4364, 3711, 275, 1635, 281, 253, 20407, 3753, 273, 253, 22909, 50276, 74, 717, 25661, 4404, 33944, 253, 2929, 1955, 281, 253, 1180, 273, 13260, 4845, 327, 253, 2746, 5678, 342, 253, 3710, 7103, 4758, 285, 7990, 273, 789, 253, 9021, 281, 253, 1673, 273, 4715, 14237, 1646, 3710, 50276, 503, 7325, 619, 39500, 275, 46705, 14924, 3249, 432, 253, 7680, 1146, 327, 253, 1698, 2189, 323, 253, 17857, 32888, 3114, 253, 4477, 4271, 2234, 3603, 326, 651, 1818, 436, 13214, 50276, 909, 1275, 1699, 253, 12342, 672, 253, 11333, 1891, 281, 1089, 271, 8813, 36037, 982, 327, 23256, 608, 50276, 2858, 841, 513, 417, 1132, 247, 4275, 2554, 275, 253, 4081, 789, 50276, 783, 288, 18, 73, 19, 1543, 403, 275, 690, 3282, 8943, 326, 253, 4081, 22909, 403, 9013, 18471, 323, 288, 18, 1223, 271, 1774, 3284, 352, 651, 320, 1077, 10084, 604, 436, 359, 624, 253, 1083, 597, 13414, 5752, 347, 247, 4209, 7680, 275, 616, 1211, 987, 253, 288, 20, 5301, 3133, 281, 320, 8489, 523, 30487, 1580, 597, 1705, 432, 1027, 4973, 50276, 66, 625, 7899, 5301, 651, 320, 281, 16518, 253, 3779, 4364, 33828, 1754, 327, 253, 5028, 36871, 1929, 26332, 18964, 253, 638, 12380, 3169, 1491, 3587, 1293, 326, 368, 49446, 366, 1097, 253, 4327, 273, 2770, 285, 3745, 281, 6780, 326, 4327, 50275, 34974, 323, 253, 4477, 50276, 18, 849, 513, 368, 5386, 253, 16907, 2266, 9376, 326, 253, 3268, 273, 2938, 592, 2439, 253, 1375, 2317, 310, 3907, 2190, 253, 2938, 592, 31506, 2139, 476, 359, 1902, 436, 281, 320, 247, 5272, 9376, 281, 1056, 50276, 19, 849, 651, 368, 5386, 253, 10096, 327, 253, 4751, 24802, 50276, 4698, 729, 757, 9376, 327, 253, 2806, 3364, 3453, 326, 310, 908, 323, 4473, 9162, 26332, 672, 253, 2120, 1375, 2550, 320, 557, 39833, 407, 2819, 387, 253, 3601, 3815, 50275, 977, 5884, 2792, 273, 7756, 323, 253, 2929, 50276, 80, 2564, 253, 14951, 908, 323, 634, 4736, 873, 2822, 253, 990, 273, 3239, 374, 368, 403, 970, 247, 1027, 16144, 685, 253, 581, 5611, 50276, 80, 809, 79, 495, 3133, 281, 452, 247, 3632, 24312, 387, 253, 990, 5474, 33032, 6010, 436, 2929, 23970, 247, 747, 1332, 323, 4499, 422, 8813, 273, 24762, 3210, 327, 22453, 3061, 11849, 3237, 26332, 15571, 2139, 247, 27204, 2098, 310, 417, 347, 1175, 347, 253, 985, 4081, 2098, 253, 749, 32581, 1319, 273, 27204, 5827, 310, 27948, 715, 767, 3510, 12078, 5231, 285, 4067, 2105, 50276, 936, 1056, 253, 8813, 625, 34007, 281, 1966, 253, 2488, 23970, 12342, 281, 8813, 534, 403, 8025, 13989, 267, 3607, 273, 3054, 4473, 49996, 476, 320, 10166, 327, 3530, 281, 3283, 253, 3361, 273, 1016, 4473, 275, 247, 1375, 281, 5513, 271, 12078, 2250, 253, 1332, 5012, 247, 5816, 638, 12380, 4473, 273, 253, 11741, 2250, 281, 5513, 247, 4067, 2105, 352, 5012, 247, 873, 273, 8612, 12342, 824, 326, 253, 27204, 5231, 762, 841, 12342, 403, 16293, 281, 1918, 247, 4067, 2105, 685, 253, 4081, 2098, 253, 2929, 671, 3400, 7000, 11333, 281, 1089, 638, 32978, 390, 8612, 12342, 970, 12002, 2105, 3470, 
323, 5231, 253, 4477, 671, 9569, 2969, 23256, 983, 281, 7472, 253, 7162, 273, 1016, 8813, 50276, 783, 4477, 5196, 2608, 2175, 327, 253, 4081, 8813, 1332, 285, 5183, 697, 31471, 50275, 45563, 50276, 7645, 8813, 310, 6296, 247, 1077, 1774, 1673, 3063, 323, 667, 13361, 3210, 22453, 3061, 11849, 3210, 403, 4633, 285, 7561, 908, 533, 8813, 3082, 323, 731, 403, 1335, 8489, 15560, 18398, 446, 2149, 347, 2361, 275, 253, 2929, 2045, 789, 2057, 3797, 12342, 3587, 275, 1566, 4715, 417, 1501, 37806, 390, 897, 3779, 4364, 8115, 417, 1966, 12342, 436, 789, 23970, 1966, 12342, 281, 1501, 37806, 8813, 50276, 783, 2644, 8813, 1332, 310, 4460, 285, 2789, 3282, 281, 479, 253, 5697, 403, 27350, 533, 253, 11333, 323, 4560, 253, 22909, 285, 16344, 253, 7162, 403, 37825, 50275, 20881, 1255, 50276, 249, 619, 4743, 253, 2608, 1263, 1335, 556, 2316, 323, 7756, 28910, 323, 288, 18, 285, 288, 19, 3587, 7004, 253, 2608, 534, 581, 310, 625, 4217, 1057, 417, 1007, 751, 247, 1175, 1039, 273, 5301, 387, 1878, 281, 479, 697, 1892, 281, 2028, 534, 581, 310, 625, 4217, 604, 891, 452, 2168, 1239, 1097, 285, 7192, 5046, 697, 1805, 281, 760, 921, 581, 8813, 281, 247, 14687, 285, 1642, 849, 973, 597, 7192, 390, 1339, 731, 1347, 247, 4836, 347, 275, 288, 20, 28910, 323, 288, 18, 347, 891, 923, 275, 4677, 854, 67, 253, 8813, 342, 12342, 2636, 556, 4555, 625, 1491, 685, 253, 8245, 8813, 320, 275, 643, 3000, 2636, 310, 4473, 50276, 5695, 1223, 320, 310, 760, 2460, 320, 310, 247, 7654, 8578, 273, 2636, 824, 247, 5301, 1057, 417, 1646, 1077, 4217, 651, 352, 320, 1805, 604, 368, 323, 1650, 7277, 4473, 7483, 342, 2460, 7483, 824, 347, 11922, 275, 253, 1375, 2011, 275, 253, 2460, 275, 2636, 28910, 323, 288, 20, 253, 4836, 4758, 310, 4722, 533, 275, 253, 2011, 1650, 253, 4473, 326, 253, 2608, 556, 281, 3037, 310, 7960, 281, 897, 253, 5234, 534, 4453, 8489, 1512, 2969, 285, 417, 1077, 4722, 352, 651, 320, 625, 12994, 604, 253, 4473, 310, 625, 2570, 24088, 368, 16216, 2965, 745, 247, 6415, 390, 5181, 271, 9054, 347, 275, 1114, 442, 91, 9307, 28910, 891, 1158, 253, 5014, 4114, 1491, 2363, 8645, 849, 17875, 3966, 310, 1774, 323, 247, 2608, 1263, 285, 891, 1804, 1907, 352, 275, 253, 2022, 2929, 390, 387, 1878, 2403, 352, 6927, 281, 4366, 50276, 262, 3133, 326, 16774, 2175, 403, 417, 2530, 323, 7162, 4868, 10272, 26332, 849, 973, 513, 253, 7363, 2723, 342, 4588, 8813, 7200, 50275, 34974, 50276, 249, 2593, 577, 1386, 1668, 368, 5393, 275, 2219, 835, 359, 403, 16293, 326, 253, 4473, 1618, 310, 41389, 812, 368, 2085, 690, 6667, 273, 841, 2219, 50275, 249, 2593, 608, 8813, 8137, 368, 5393, 253, 3186, 369, 2104, 281, 4271, 253, 3264, 8813, 323, 1016, 27204, 1057, 326, 1599, 512, 253, 3453, 22909, 403, 7899, 604, 594, 352, 588, 320, 4722, 281, 923, 849, 253, 3186, 17923, 327, 625, 2570, 8892, 50276, 1542, 22909, 327, 4067, 2105, 347, 275, 4677, 898, 67, 281, 479, 253, 954, 7445, 8813, 651, 320, 816, 24364, 2250, 7450, 598, 672, 3817, 327, 14863, 894, 4815, 387, 1878, 884, 643, 3104, 403, 417, 4217, 275, 643, 3000, 352, 1537, 320, 1805, 281, 2770, 327, 5018, 835, 253, 27204, 556, 4067, 2105, 685, 4081, 2098, 513, 368, 452, 667, 7906, 327, 436, 1386, 50275, 555, 993, 50276, 4674, 374, 9484, 305, 873, 273, 4736, 3054, 8266, 16706, 50276, 50237, 247, 20, 5933, 374, 1386, 1249, 9484, 7154, 2490, 187, 4118, 18435, 27, 2520, 2929, 310, 271, 27807, 1263, 273, 6083, 326, 476, 1918, 22909, 4499, 422, 273, 616, 5231, 3066, 24762, 6779, 824, 347, 10756, 50276, 21215, 476, 671, 1581, 4212, 281, 9059, 1411, 253, 6083, 7089, 891, 717, 6685, 17847, 407, 253, 3290, 
273, 253, 37317, 5701, 285, 11985, 50276, 262, 310, 671, 4722, 326, 253, 30628, 452, 4447, 767, 19509, 273, 1869, 327, 253, 2929, 581, 2986, 8414, 273, 391, 20, 285, 391, 22, 665, 403, 275, 4345, 275, 11571, 9393, 4087, 7291, 3006, 253, 5075, 2792, 275, 253, 2929, 50276, 783, 643, 2986, 8414, 273, 391, 18, 391, 19, 285, 391, 21, 665, 16928, 253, 16108, 273, 752, 597, 923, 347, 2266, 2792, 50276, 1439, 1598, 512, 30628, 452, 9648, 1029, 7162, 2193, 50276, 7483, 581, 7162, 4868, 273, 495, 285, 512, 2571, 403, 577, 50276, 262, 369, 247, 45210, 1083, 285, 417, 271, 3477, 3061, 275, 253, 990, 253, 2086, 9353, 4425, 326, 253, 2929, 275, 697, 1655, 830, 1057, 417, 3240, 2525, 253, 2534, 285, 651, 5649, 432, 1529, 18520, 923, 24088, 391, 21, 5701, 50276, 664, 1158, 326, 253, 789, 310, 4722, 285, 11907, 253, 4477, 281, 2953, 253, 30628, 5701, 285, 501, 538, 2225, 253, 789, 281, 1529, 18767, 209 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper does not seem to present a big departure from existing approaches the results seem somewhat on par although the diversity in phoneme duration is measurably higher than that of prior approaches that appears to be a good accomplishment but im not an expert in this area and cannot therefore assess the importance of the technical contribution other than that the paper is well written and could trigger followup discussions in the tts community im thus not opposed to acceptance docsepindeed it appears that the proposed model increases sample variability however from figure 1 it seems like the model with our prior slightly increases mos but can only outperform glowtts using additional data i believe a fair comparison should also include glowtts trained using the additional data the authors also do not address how their model is more stable nor show evidence that the prior speeds up training minor observations the tts acronym is never introduced line 37 column 1 these concern this concern or these concerns line 2 column 2 most related work the most additionally it is difficult to appreciate this sentence and the next here without knowing what you do in more detail line 96 we wish to seems imprecise i guess the objective is to model x as a bijective transformation of z the latter being standard normal line 56 column 2 inference is a very broad word encompasses eg parameter estimation and taking intervals sampling would be more precise ### Summary:
the paper is on topic for the workshop the reviews are mixed with the main positive point being that the sample diversity is increased as is one of the claims made in the paper the main points of criticism are 1 that the proposed method is only a minor deviation of existing approaches 2 that the model can only outperform the baseline model in terms of mos scores when using additional data for the proposed method while not allowing the baseline method to do so and finally 3 that the claims of increased stability and speed up in training are not backed up by evidence we have decided to accept this paper but urge the authors to take into account the reviewers comments for the camera ready version especially points 2 and 3
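For context on the objective the reviewer guesses at above ("model x as a bijective transformation of z"), the standard change-of-variables likelihood used by Glow-TTS-style flow models can be sketched as follows; this is the generic textbook formulation, assumed here for illustration, not an equation taken from the reviewed paper.

```latex
% Generic normalizing-flow likelihood (standard formulation, assumed here):
% z = f_theta(x) is an invertible map of the data x with a standard-normal prior on z.
\log p_\theta(x) \;=\; \log \mathcal{N}\!\bigl(f_\theta(x);\, 0, I\bigr)
                \;+\; \log \left|\det \frac{\partial f_\theta(x)}{\partial x}\right|
```

Sampling then draws z from N(0, I) and inverts the flow, which is why the choice of prior directly affects sample diversity, the property the review and summary above focus on.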
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper presents the analysis of 13 zerocost proxies on 28 tasks and releases the precomputed scores of them the authors also studied a way to combine all zero cost proxies to allow better nas performance of predictorbased algorithms a good summary and comprehensive comparison of zerocost proxies published empirical analysis of the performance of many zerocost proxies a deep dive into the generalization abilities of them on different search spaces and tasks this kind of analysis is not done before understanding if the zerocost proxies can explain the ground truth accuracy and how they complement each other in terms of information gain a novel way to combine all zerocost proxies studied as features into the nas predictor most results are empirical which provide some valuable insight into so many zc proxies however there is no theoretical analysis or explanation on such observations docsepin this work nasbenchsuitezero evaluates 13 zerocost proxies for architecture performance prediction across 28 tasks it creates by far the largest dataset enabling ordersofmagnitude faster experiments on zc proxies the provided codebase is accessible through a unified interface created with the aim to facilitate reproducible generalizable and rapid nas research 1 the collection of benchmarks and zc proxies that unifies and accelerates research on zc proxies a promising new subfield of nas by enabling ordersofmagnitude faster evaluations on a large suite of diverse benchmarks 2 the largescale analysis of 13 zc proxies across 28 different combinations of search spaces and tasks reveals the generalizability bias and mutual information among zc proxies 3 as a kind of complementary information zc proxies can significantly improve the predictive power of surrogate models commonly used for nas 4 the paper is easy to read and in terms of the open source code it provides an easyuse interface and wellorganized documentation 1 it is largely based on nasbenchsuite and the main difference comes from more datasets publicly releasing zc proxy values combining zc proxies in a nontrivial way and exploiting the complementary information of 13 zc proxies simultaneously 2 the indepth analysis from section 4 could have been more as the basic information largely overlapped with nasbenchsuite the bias part in section 43 is interesting perhaps more analysis in the relation between the design principle of zerocost proxy and results 3 the definition and explanation in the table content can be more detailed such as in table 3 docseprecent prior work introduced zerocost proxies zc proxies as a means to predict architecture performance to significantly speed up neural architecture search algorithms this work introduces a novel benchmark dataset nasbenchsuitezero to study 13 zc proxies across 28 tasks main contributions and findings are introduced largest to date benchmark dataset for zc proxies streamlining and speeding up experimentation informationtheoretic analysis of zc proxy behavior yielding finding that they capture complementary information evaluation showing that incorporating all 13 zc proxies into nas surrogate models improves their predictive performance significantly very detailed documentation of creation of the benchmark suite research questions as well as experimental setup following best practices throughout clearly stating infrastructure details hyperparameter choices filledout nas best practice
checklist etc benchmark dataset access is straightforward and well documented all code and supporting materials eg figures are included in the git repository appendix contains a very detailed data sheet addressing many common questions users may have and extending even to ethical and social implications this is outstanding writing and elaboration of research questions and corresponding answers is very clear and accessible research findings are very thoughtprovoking and frequently point out clear directions for future research specifically using zc proxies as task features for metalearning degree to which zc information is complementary is taskdependent motivating experimentation in this work to combine them by adding corresponding features to the xgboost inputs a potentially missed opportunity may consist in performing an analysis of feature importance of the zc proxy features on the trained surrogate model from the informationtheoretic analysis we can argue that combining multiple zc proxies is beneficial however it is not obvious to me from these results that we can exclude a merely marginal contribution to the speedup by individual zc proxies feature importance values may indicate which combinations of proxies actually contribute to observed speedups rq3 answers seem a bit inconclusive ie not immediately actionable it may be helpful to outline how removing these biases would work in a concrete case docsepthis paper targets the zcnas which is an interesting and promising subfield of nas and aims to answer the important questions about the performance generalization and bias of zc proxies and their combinations to achieve this the authors create the nasbenchsuitezero codebase precompute zc scores on the architectures in existing benchmarks and run a comprehensive analysis based on them 1 the authors create the nasbenchsuitezero an extensible collection of 13 zc proxies on 28 nas benchmark tasks such proxies are precomputed and accessible through a unified interface 2 the authors run a largescale analysis of the above dataset to study the generalizability bias and mutual information among zc proxies 1 about the search space selection as 3 45 and other works including this one observe the performance of zc proxies highly depends on the search space and task however the search space used in this benchmark is far from the modern search space such as resnet and mobilenet 3 this limits the significance of this work and makes observations in it less guiding to the realworld nas applications 2 the authors use spearman ranking correlation to evaluate the effectiveness of zc proxies however ranking correlations cannot naturally reflect the true ability in the scenarios of constrained nas 325 thats why other metrics such as the precisionk bestrankingk 25 3 the authors use conditionalentropy based information gain to prove the improvement of predictive power of a tuple of zc proxies the idea is interesting but the ig is considering the whole distribution which also deviates from the goal of nas to select the topk architectures docsepthis work replicates some of the typical zero nas on multiple nas benchmarks this work gives zero nas fair comparison data and gives some new research challenges such as on trans datasets however i still think the contribution of this work is not enough for the top conferences 1 this work is very well organized and written 2 the authors have done more experiments and have done a good job of open source work 1 this work seems to me to be just a review result and it is more
appropriate to appear in a new nas method rather than a new benchmark since almost all methods and datasets exist and much of the data has already appeared in previous work 2 some of the data from this work is questionable such as why it is so bad on trans nas docsepthis paper creates a nas benchmark nasbenchsuitezero including 13 zerocost zc proxies 28 tasks and 15 million zc proxy scores in total the dataset can be used to speed up zc proxybased nas experiments and provide convenience for zc proxybased nas research the authors also conduct generalizability bias and informationtheoretic analysis of zc proxies showing that combining several zc proxies can improve the performance of nas surrogate models and nas algorithms the zc proxy dataset can significantly speed up zc proxybased nas experiments the codes are accessible in github details of reproducibility are included extensive analysis of 13 zc proxies across 28 different combinations of search spaces and tasks by studying the generalizability bias and mutual information among zc proxies the paper gives several valuable findings eg combining several zc proxies can improve the performance of nas surrogate models and nas algorithms although there are docs about how to reproduce the results in the paper the interfaces for using the datasets are missing for further research the authors are expected to provide detailed documentation and quickstart about how to use the codebase conducting more theoretical analysis would be better ### Summary:
this paper presents a systematic review of zerocost proxies for neural architecture search by design these proxies are cheap and easytoevaluate but results reported in the literature have been mixed in this light a benchmark seems relevant and important for further progress in this area the reviewers uniformly appreciated the extent of tasks and proxies considered moreover the reviewers also appreciated efforts made to explore the complementary strengths of these proxies there were some concerns related to the search space and metrics used but those were addressed by the authors during the rebuttal period finally the availability of opensourced and welldocumented code is a plus that was appreciated by everybody
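The evaluation protocol these reviews keep returning to (rank correlation of proxy scores with ground-truth accuracy, a top-k style metric for constrained search, and feeding proxy scores to an xgboost surrogate) can be sketched in a few lines. The sketch below is illustrative only: the variable names and surrogate configuration are assumptions, not the actual nasbenchsuitezero interface.

```python
# Illustrative sketch only: variable names and the surrogate setup are assumptions,
# not the nasbenchsuitezero API.
import numpy as np
from scipy.stats import spearmanr
from xgboost import XGBRegressor

def rank_correlation(proxy_scores, val_accuracies):
    """Spearman rank correlation between one zc proxy and ground-truth accuracy."""
    rho, _ = spearmanr(proxy_scores, val_accuracies)
    return rho

def precision_at_k(proxy_scores, val_accuracies, k=10):
    """Fraction of the true top-k architectures that the proxy's top-k recovers."""
    top_true = set(np.argsort(val_accuracies)[-k:])
    top_proxy = set(np.argsort(proxy_scores)[-k:])
    return len(top_true & top_proxy) / k

def fit_surrogate_with_proxies(arch_encodings, proxy_matrix, val_accuracies):
    """Append the 13 zc proxy scores as extra input features of an xgboost surrogate."""
    features = np.concatenate([arch_encodings, proxy_matrix], axis=1)
    model = XGBRegressor(n_estimators=300, max_depth=6, learning_rate=0.1)
    model.fit(features, val_accuracies)
    return model
```

On such a surrogate, inspecting the fitted model's feature_importances_ attribute is one way to run the per-proxy importance analysis that one reviewer flags as a missed opportunity.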
Below is a review of a research paper from a conference or journal. Please write a summary of the review.

### Review:
This paper proposes Polyhistor and Polyhistor-Lite, consisting of decomposed hypernetworks and layer-wise scaling kernels, to share information across different tasks with only a few trainable parameters and to address parameter-efficient multitask adaptation for vision tasks. The authors construct a unified framework with the same implementation details and provide a comprehensive and fair comparison between existing parameter-efficient adaptation works from NLP on multitasking dense vision problems. Compared with the state-of-the-art multitasking parameter-efficient adaptation method, the method achieves a competitive performance improvement with a 90% reduction in trainable parameters.

Strengths:
1. This paper conducts a thorough study of how the existing, successful parameter-efficient methods for NLP tasks perform on vision tasks, i.e., semantic segmentation, human part segmentation, saliency detection, and surface normals estimation.
2. The authors design a novel parameter-efficient method for adaptation to dense vision tasks. Specifically, the hypernetworks take task embeddings and layer embeddings as input to produce low-rank matrices, from which the adapter weights are obtained.
3. Experimental results show that Polyhistor-Lite achieves a competitive performance gain compared with the state-of-the-art method while using only a very limited number of tunable parameters.

Weaknesses:
1. There are many hyperparameters in this method, e.g., the task embedding size, the dimension of the ranks, and the down-projection ratio of the adapters; searching for suitable hyperparameters costs a lot of resources.
2. The method is implemented on a Swin Transformer backbone; it would be better to conduct experiments on other backbones as well.

The method is novel and solves multiple tasks with limited tunable parameters. However, several hyperparameters need to be tuned, and the method only focuses on dense vision tasks; it would be better to include more common vision tasks like object detection.

The manuscript provides an extensive multitask parameter-efficient benchmark and examines existing parameter-efficient fine-tuning NLP methods on vision tasks. The main contribution is that this work is the first to address parameter-efficient multitask adaptation for vision tasks and develops a unified framework to benchmark several parameter-efficient fine-tuning NLP methods on dense vision tasks. The paper is mostly well written and the idea is straightforward yet effective. The paper seems to combine several existing methods and extend their applicable scope, which makes the contribution less of a strength; however, the authors did make several modifications to the existing methodologies, which should be detailed and illustrated, and the true novelty of the work should be further justified.

This paper proposes a parameter-efficient multitask adaptation method for dense vision tasks called Polyhistor-Lite. It is used to adapt a pretrained hierarchical vision transformer for solving multiple dense vision tasks. The proposed method consists of two aspects: the decomposed hypernetworks and the layer-wise scaling kernels. Models are evaluated on the PASCAL-Context dataset for semantic segmentation, human part segmentation, surface normals estimation, and saliency detection, and are shown to be effective.

Strengths:
1. It is interesting to explore parameter-efficient adaptation techniques for dense vision tasks; this has not been investigated in this area before.
2. The presentation of this work is clear. The paper also provides a detailed discussion of the differences and relations with parameter-efficient multitask adaptation methods from NLP tasks.
3. The proposed parameter-sharing method is reasonable and is shown to be helpful in reducing the learned parameters while keeping relatively high performance.
4. Comparisons with existing approaches are thorough and significant.

Weaknesses:
1. In Figure 2b, the transformer in the lower part has no direct connection with the upper part and is meaningless.
2. The proposed method is only evaluated with the Swin Transformer. As claimed by this paper, it is designed for hierarchical vision transformers; would it also work well with other hierarchical vision transformers?

Limitations are discussed in the paper.

The paper proposes Polyhistor, a parameter-efficient tuning method for jointly tuned dense vision tasks. Previous adapter-like methods from NLP are benchmarked in detail on dense vision tasks, and the proposed method is shown to give a better parameter-performance trade-off on the studied tasks.

Strengths:
1. The paper is neatly written; it is easy to follow and understand.
2. I have not seen many papers that study parameter-efficient tuning specifically for dense vision tasks, so the benchmark of adapter-like methods in this paper should be helpful for the community. The authors also promised to release the code.
3. The idea makes sense to me and the performance is well supported by the experiments. Low-rank decomposition and dynamic weights are nothing new, but they are properly applied; the layer-wise scaling kernel is novel and effective.

Weakness: the comparison w.r.t. single-task baselines does not seem fair to me; a model jointly trained on similar tasks is expected to outperform models trained on single tasks separately. I feel results on single-task training should be reported. It is OK to perform worse on a few tasks, since this paper is mainly comparing with multitask methods, but the results should be presented for completeness.

1. Only a pretrained Swin Transformer is studied in this paper, but I feel this method can be easily extended to ConvNets.
2. More experiments on models pretrained with SSL tasks should also make this paper stronger: since SSL models are competitive against ImageNet-pretrained models and their features are very different, the behavior of a method can be very different depending on the pretraining task.

### Summary:
The proposed Polyhistor and Polyhistor-Lite for parameter-efficient multitask adaptation achieve competitive performance gains on dense vision datasets. All reviewers give consistently positive scores. The requested experiments for more backbones and self-supervised backbones, as well as additional analyses, have been added accordingly during the discussion phase. Reviewer gyt3 is concerned about the unclear explanation of the framework and why the hypernetwork and scalable kernels could help; the authors addressed the issues and modified the paper. The meta-reviewers thus recommend accepting this paper and encourage the authors to add all the new experiments and make the presentation clearer in the camera-ready version.
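For readers skimming this row, the mechanism the reviews describe (hypernetworks that map a task embedding and a layer embedding to low-rank factors from which adapter weights are assembled) can be illustrated with a minimal sketch. This is not the paper's implementation; all sizes and names below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, bottleneck, rank, d_emb = 96, 24, 4, 16   # hypothetical sizes

# Shared hypernetwork heads: small matrices that emit low-rank factors.
W_left = rng.normal(scale=0.02, size=(2 * d_emb, d_model * rank))
W_right = rng.normal(scale=0.02, size=(2 * d_emb, rank * bottleneck))

def adapter_down_weight(task_emb, layer_emb):
    """Generate one adapter's down-projection weight from a task and a layer embedding."""
    z = np.concatenate([task_emb, layer_emb])        # conditioning vector, length 2*d_emb
    left = (z @ W_left).reshape(d_model, rank)       # (d_model, rank)
    right = (z @ W_right).reshape(rank, bottleneck)  # (rank, bottleneck)
    return left @ right                              # low-rank (d_model, bottleneck) weight

task_emb = rng.normal(size=d_emb)    # one embedding per task
layer_emb = rng.normal(size=d_emb)   # one embedding per transformer layer
W_down = adapter_down_weight(task_emb, layer_emb)
print(W_down.shape)                  # (96, 24)
```

The parameter saving the reviews mention presumably comes from training only the small shared hypernetwork heads plus the per-task and per-layer embeddings, rather than a full adapter for every task-layer pair.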
Below is a review of a research paper from a conference or journal. Please write a summary of the review.

### Review:
This paper proposes a behavior-similarity-based representation learning method that improves generalization performance in an offline RL setting. Two key ideas are presented here:
1. The proposed behavior similarity is based on the distance between the GVF predictions of two states.
2. To get a robust estimate of the distance between two GVF signals, the distance is formulated as a cumulative distribution function under assumptions, which is further converted into K quantile bins and can be learned by self-supervised learning methods as multi-label classification tasks, as defined in Eq. 6.
Procgen and DM Control Suite experiments were conducted to demonstrate the effectiveness of the proposed method.

Strengths: the problem setting of studying generalization using an offline dataset (unlike RAD, CURL, and the bisimulation metric) has significant importance for real-world problems. The authors demonstrate sufficient knowledge of the generalizable RL literature; I really enjoyed reading the introductory part. The proposed behavior similarity based on GVF predictions makes sense to me and should lead to better generalization performance. The conversion of the robust GVF distance estimation problem into a multi-label (K-bin) self-supervised learning task is quite nice.

Weaknesses: regarding the generality of the GVF functions, I am concerned about how to specify the cumulant of the GVF function, as this is the key part that determines the generalization performance, as the authors mention in lines 314-319. After checking the paper, specifying the cumulants still seems to require predefinition using some kind of human heuristic knowledge; will the authors explain how to resolve this concern about the generality of hand-specified cumulants? Also, the performance gain (Fig. 8 in the appendix, page 21) seems not significant: the improvement over vanilla PPO on the Procgen benchmark seems very small, and most tasks do not see a significant performance improvement except for CaveFlyer, Leaper, and Plunder. Will the authors help explain the reasons for that?

N/A

This paper proposes a framework called Generalized Similarity Functions (GSF), which estimates the similarity between observations from their various future behaviors. Previous works tried to calculate the similarity based on future action sequences, state distributions, values, etc., but the adequacy of those methods depends on the problem, and no method is optimal in all circumstances. To this end, GSF learns from some arbitrary c(s, a) function and results in value functions of those functions; the authors argue that the proposed framework can include the previous methods by using an appropriate c(s, a). After learning the GSF, the paper augments the data with additional labels (i.e., quantiles of the GSF) and learns the latent space by minimizing an InfoNCE loss.

This paper proposes a framework that can capture any signal and use it to train the representation. It is a generalized framework that can recover a number of previous frameworks; in that sense it is somewhat original, but not something that is completely new. The paper seems to have OK-ish technical quality, but some material is not clearly written and is hard to follow (e.g., the definitions of c, g, and f). The experiment section is also somewhat hard to understand; what is the optimality gap in Figure 4? This paper will have some impact on researchers in this field due to the good performance and the proposed benchmarks. The authors adequately addressed the limitations and potential negative social impact of their work.

The paper proposes a novel representation learning method, Generalized Similarity Functions (GSF), to improve the generalization ability of offline RL. GSF clusters latent states with similar future behaviors, quantified by the order statistics of generalized value functions (GVFs). Empirical results on the offline Procgen benchmark and the offline Distracting Control Suite show that GSF successfully enhances performance on unseen tasks.

Strengths: the problem of zero-shot generalization in offline RL is significant, and the idea of GSF seems novel. The choice of hyperparameters has some theoretical analysis. The paper conducts experiments on both discrete and continuous benchmarks, and the experimental results in both offline and online settings demonstrate the effectiveness of the proposed method.

Weaknesses: some related works are missing; [1] and [2] both introduce representation clustering methods for offline multi-task/meta RL and can generalize to unseen tasks. There is no visualization of the learned representations of different training and testing tasks, which could help readers better understand the effectiveness of GSF. The paper does not discuss the computation cost of GSF.

[1] Li, J., Vuong, Q., Liu, S., et al. Multi-task batch reinforcement learning with metric learning. NeurIPS 2020.
[2] Li, L., Yang, R., Luo, D. FOCAL: Efficient fully-offline meta-reinforcement learning via distance metric learning and behavior regularization. ICLR 2021.

The authors state that the limitation of the proposed method is the search for hyperparameters.

### Summary:
This paper studies an interesting problem, and overall the reviewers agreed that the exposition and validation are sufficient. We encourage the authors to consider the issues raised by the reviewers and to further improve the work in the final version.
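To make the quantile-bin construction described in the first review concrete, the following is a minimal sketch of turning distances between GVF prediction vectors into K quantile-bin labels for a self-supervised classification head. It is an illustration under assumed shapes and random stand-in data, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_gvf, K = 512, 8, 10      # hypothetical: states, GVF heads, quantile bins

# Stand-in for learned GVF predictions (one vector of future-behavior signals per state).
gvf_preds = rng.normal(size=(n_states, n_gvf))

# Behavior distance between sampled pairs of states = distance between their GVF predictions.
i = rng.integers(n_states, size=256)
j = rng.integers(n_states, size=256)
dists = np.linalg.norm(gvf_preds[i] - gvf_preds[j], axis=1)

# Discretize the distances into K quantile bins; the bin index becomes the label that a
# self-supervised similarity head is trained to classify.
edges = np.quantile(dists, np.linspace(0.0, 1.0, K + 1)[1:-1])
labels = np.digitize(dists, edges)   # integers in [0, K-1]
print(np.bincount(labels, minlength=K))
```

Using quantile edges rather than fixed thresholds keeps the K classes roughly balanced, which is presumably what makes the formulation robust to the scale of the GVF signals.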
Below is a review of a research paper from a conference or journal. Please write a summary of the review.

### Review:
The paper proposes a method for exploration of 3D articulated environments that alternates between collecting interaction data with RL, while maximizing a combination of extrinsic and intrinsic rewards, and training visually conditioned action maps, image-conditioned manipulation trajectory priors, and success predictors that further guide the intrinsic reward prediction during data collection.

I have two main concerns regarding the paper:
1. The link between the visual perception and the RL policy appears weak, as the only feedback is through exploration rewards that push the RL policy to try out interactions at places where the visual perception model assigns low success probability. But is this a good exploration bonus? What if these are indeed simply not good places to act and succeed? Shouldn't the certainty of the visual model be taken into account, as opposed to the low probability itself?
2. The naive RL baseline seems to train one RL policy across multiple tasks (if I understood correctly) that operates on a point cloud input. This baseline is designed to fail. It can be much improved by training a separate RL policy in each environment and simply disabling the visual perception representations; then, at the end, we can train actors that operate directly from images, similar to an asymmetric actor-critic setup.

Could the authors show results for the baseline suggested above? Could the authors explain the rationale of the exploration reward? It is possible that I am missing something in my understanding of the paper; I will be careful during the discussion period to clarify any misunderstandings.

Post-rebuttal: the authors have put together the requested baseline, and they show significant performance margins over it. Thank you very much for this; I raise my score accordingly.

This paper solves the problem of pushing and pulling objects, mostly things like cabinets, by learning visual action trajectory proposals via curiosity-driven joint RL-perception training. The system takes point clouds of the object as input and outputs the actionable score and the per-trajectory success likelihood score for the most likely approach to interacting with the object. This approach is validated in simulation and also with real results.

Overall comments: the main novelty of this paper is a slight twist on the Where2Act paper; instead of generating grip orientations, this paper generates trajectories. This difference necessitates the differences in the models. However, the authors do show a comparison against the Where2Act approach demonstrating that their modifications to the network, including the curiosity exploration and how they find the trajectories, are superior for this task. This paper is easy to understand and well written; the ideas are easy to follow and build on each other well. The appendix was very useful for clarifying some parts of the paper in more detail, such as the curiosity-driven explorations and where the heuristic-based method failed.

Smaller comments: the term "step" seems to refer to a waypoint step but can also be confused with a timestep; can you clarify when you use it? Please cite the prior cVAE work that the trajectory proposal module is based on.

This paper has some interesting elements to it, and the approach is validated by both real and simulated results. While the task isn't very novel, the authors at least validated their approach to show that it does better than previous approaches, thus contributing to the field. They also do a good job of explaining each of the steps in the appendix to make it easy to understand what exactly they are doing. For these reasons it merits inclusion in the conference.

The paper extends work on static, short-term action generation (Where2Act, ICCV 2021) for 3D articulated objects to (1) long-term action trajectory generation, by learning from data generated via RL exploration, and (2) action trajectories conditioned with task-awareness.

Pros: the paper proposes the problem of long-term action trajectory generation for 3D articulated objects, which is not well studied. The paper is well written.

Cons: the method itself is not very novel; it is more about extending the existing Where2Act and combining Where2Act with curiosity guidance for an RL policy for interactive trajectory exploration. A baseline for the curiosity guidance for the RL policy for interactive trajectory exploration, which is one of the main components of the method, is not compared in the trajectory generation.

The paper proposes to solve a new problem, long-term action trajectory generation for 3D articulated objects, but the method for solving it is more of an extension and combination of existing work. The overall quality of the paper, writing, and experiments is good.

### Summary:
The paper claims to present actionable visual representations for manipulating 3D articulated objects; specifically, the approach learns to estimate the spatial affordance map as well as the trajectories and their scores. After checking the rebuttal from the authors, all reviewers agree that the paper adds value to the research area, and in the end it received three borderline-accept ratings. The initial criticism included a lacking experimental comparison to baselines, and the authors successfully responded to the reviewer's request. One reviewer commented that the proposed approach is a combination of Where2Act and curiosity guidance for an RL policy for interactive trajectory exploration, which we believe is a valid point; still, the paper extends the previous Where2Act and successfully demonstrates its success on difficult tasks. We recommend accepting the paper.
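As a rough illustration of the pipeline the reviews describe (a per-point actionability map over the point cloud plus a per-trajectory success likelihood used to rank proposals), the following sketch substitutes random stand-ins for the learned networks; every name and shape here is hypothetical, and it is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_points, feat_dim, n_traj, traj_dim = 2048, 32, 16, 30   # hypothetical sizes

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

point_feats = rng.normal(size=(n_points, feat_dim))   # stand-in for per-point features
w_afford = rng.normal(size=feat_dim)                   # actionability head (hypothetical)
w_succ = rng.normal(size=feat_dim + traj_dim)          # per-trajectory success head

# 1) Per-point actionability map: pick the most promising contact point.
afford = sigmoid(point_feats @ w_afford)
contact = int(np.argmax(afford))

# 2) Score sampled candidate trajectories (flattened waypoints) at that point.
trajs = rng.normal(size=(n_traj, traj_dim))            # stand-in for proposals, e.g. from a cVAE
inputs = np.concatenate(
    [np.repeat(point_feats[contact][None, :], n_traj, axis=0), trajs], axis=1)
succ = sigmoid(inputs @ w_succ)
best_traj = trajs[int(np.argmax(succ))]
print(f"actionability={afford[contact]:.2f}, best success score={succ.max():.2f}")
```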
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes a macro search space which allows blocks in a model to be different to promote performance to provide a systematic study of the performance of nas algorithms in a macro search space the paper presents blox a benchmark that consists of 91k unique models trained on the cifar100 dataset the dataset also includes other hardware latency information 1 macro search space and benchmark for nas with 91k unique architectures the accuracy improvement with the different blocks is remarkable according to figure 4 2 the analysis includes different nas algorithms on blox in terms of block signatures accuracy predictors and training methodologies these factors include 2 existing works named donna and hant 1 the training process is extremely complex with three stage settings including normal setting distillation and finetuning according to figure 6 2 also with the results provided in figure 7 the good or bad teacher results in different distilled and finetuned performance and has a relatively low spearman correlation with accuracy when trained from scratch besides the number of distilled epochs also affects the spearman correlation as illustrated in the short rank correlation sketch after the summary below 1 is it hard to choose a teacher or simply use the best teacher how to choose the best teacher 2 the rank of accuracy distilled and finetuned by the bad teacher of all these models is not highly correlated with the rank of the good teacher as shown in figure 10b 3 which choice of distilled epoch makes more sense 3 with these problems is it hard for other researchers to apply this pipeline and use your blockwise search space to perform architecture search in other network architectures given the complicated training pipeline and factors affected by teachers 1 what are the most important items 2 does there exist a more simplified procedure 4 can we use the blox dataset to evaluate other commonly seen nas algorithms besides donna and hant docsepfor computational efficiency most existing nas algorithms search for cellbased architectures however this paper argues that the search space of nas algorithms should be macro ie by including the full network topology in the search space and allowing operations to differ in each block to systematically evaluate the performance of nas algorithms in macro search spaces the paper proposes blox a nas benchmark that consists of 91k cnn models with better block diversity trained on cifar100 compared to previous macro nas benchmarks the blox search space is larger as it contains more block options the paper also evaluates two recent blockwise nas algorithms on the proposed search space and provides a detailed analysis of their performance and the implications on the efficacy of different block signatures accuracy predictors and training strategies the proposed benchmark focuses on macro nas which is a relatively new and lessstudied field compared to cellbased nas the proposed search space of block operations is expansive compared to that of nasbenchmacro it contains more operations up to 45 and includes the commonly seen ones such as residual blocks and inverted bottleneck blocks the paper includes a detailed study on two blockwise nas methods which have not appeared in previous nas benchmarks the analysis in the paper provides two valuable insights first the pareto frontier of searched models of macro nas dominates that of models of micro nas this suggests that a macro search space contains higher performing models thus pointing out a promising direction
for future nas research ie to develop efficient methods that search for macro structures and allow heterogeneity between network blocks a small illustrative sketch of computing such a pareto front from accuracy and latency values is included after the summary below second for blockwise nas evaluation the paper considers two novel settings distillation and finetuning in addition to the standard training setting performance results and the analysis provided can be helpful for the development of training methodology for future algorithms full experiment details hyperparameters and code are provided for reproducibility the search space design has limited novelty and capacity the set of block options trivially expands that of nasbenchmacro moreover as blox contains more block options it only evaluates architectures with 3 stages to reduce computational cost note that the nasbenchmacro search space has 8 stages this design choice limits the depth and size of the candidate architectures and consequently affects the learning capacity of the searched model nowadays commonly used cnns for image classification even the simplest ones such as resnet50 are much larger than 3stage cnns this hinders the practical significance of the benchmark as few ml developers will actually use a network model that has a similar scale to that evaluated in the benchmark for research or industry purposes also it is unclear whether the evaluated networks can work for more complicated tasks beyond cifar100 and the analysis based on the searched results also might not generalize to more complicated architectures and difficult learning problems the paper only evaluates two blockwise nas algorithms there are many other nas methods that can be considered macro ie the ones that generate networks in a global view rather than stacking cells examples are morphismbased methods as mentioned in section 414 of 1 do these methods also count towards macro nas if not can you give a better and clearer definition of macro nas the comparison of macro nas and cellbased nas in fig 4 might be unfair when constructing the uniform block search space it seems that the authors consider models generated with the same block options and the threestage structure so that the number of candidate networks is far less than that of the macro search space however micro nas methods are often more computationally efficient and work for significantly larger search spaces for a fair comparison i think the uniform block search space should be constructed in a way that either the sizes of the uniform block and the different block search spaces are identical or the computational cost of exploring the two search spaces is similar right now i find the conclusions of section 23 less convincing 1 he xin et al automl a survey of the stateoftheart knowledge based systems 212 2021 106622 docsepthe authors have introduced a new search space for nas which allows for different cell architectures at each block contrary to cellbased search spaces where a single cell architecture is repeated throughout the network the benchmark referred to as blox collects in particular information regarding the training and validation of all architectures on cifar100 into a dataset which can be queried through an api they further evaluate several conventional nas algorithms together with two recent blockwise nas algorithms in the common setting provided by blox and use the benchmark to analyze and address a number of questions regarding different components of blockwise nas methods having a diverse set of search spaces is crucial for a proper assessment of nas methods and the macro search space provided by blox will be useful in this
respect especially given that most prior work focuses on cellbased search spaces the related queryable dataset will also allow fast evaluations of nas methods and can therefore help speed up nas research despite the fact that the search space is designed to be compatible with all nas methods including differentiable architecture search as mentioned at the beginning of section 2 no differentiable nas methods are applied to the search space this would be useful to support the claim 1 conventional nas achieves worse results than standard blockwise ft200 when a good teacher is used made in section 33 the fact that the pareto front of models with different blocks dominates that of models with uniform blocks as highlighted towards the end of section 2 highlight 1 is expected given the three orders of magnitude larger number of models with different blocks 45^3 compared to models with similar blocks 45 in the blox search space rather than comparing different and uniformblock architectures within the same search space in order to motivate the study further it would be useful to compare blox with a cellbased search space of roughly the same size see also additional feedback below docsepblox introduces a new macro nas search space and explores nas block algorithms through research questions regarding distillation the contribution consists of a precomputed benchmark and insights from controlled experiments the benchmark precomputes 95k architectures which accelerates architecture evaluation for future nas research experiments make a strong case for finetuning which is effective with a good teacher block signatures show promise as a performance predictor runtime under different hardware settings is measured and provided code is opensourced in an anonymized repository alongside good benchmark documentation in the appendix the benchmark is only evaluated on cifar100 this choice should be justified typically cifar10 is the common denominator for nas benchmark datasets another advantage of multiple datasets is the possibility of studying distilling the teacher on a new dataset which could demonstrate this methodologys usefulness this study is not enabled in the current setting i remain doubtful of the search spaces quality given its best accuracy of 76.6 on cifar100 pdarts reports more than 80 accuracy for most methods in a dartslike search space on this dataset see this link httpsarxivorgpdf190412760v1pdf therefore the superiority of macro search spaces is not clearly established only a limited number of nas methods are evaluated on this tabular benchmark which excludes weightsharing oneshot methods that are stateoftheart additional seeds for the experiments are forthcoming the code repository does not seem to contain scripts for experiments in sec 3 docsepauthors release a macro block based nas benchmark called blox that includes 91k models trained on the cifar100 dataset this benchmark includes the performance and runtime measurements on different hardware platforms the authors also perform a comprehensive comparison between blockwise nas and cellbased nas this is a timely work on the important research topic nas and the paper is well written and organized the authors conducted comprehensive experiments to characterize blockwise nas i dont think this paper has any obvious shortcomings but it can be improved in some small parts 1 the authors should clarify the reason or insight of your model architecture search space why do you design this way and what are the benefits 2 i appreciate the comprehensive experiments conducted
against q1q6 they are interesting but a little bit hard to follow for example why do you ask these questions and how are they connected please also highlight the concrete answer in bold for each question ### Summary:
this paper introduces a new nas benchmark with 95k architectures evaluated on a single dataset criticisms were quite diverse including the novelty and design of the search space the use of cellbased baselines with much smaller search spaces the limited number of algorithms being benchmarked and the limitation to a single dataset cifar100 however most of the criticisms were addressed during the rebuttal leading several reviewers to increase their scores overall all reviewers are in favour of acceptance some of them clearly so i therefore recommend acceptance as a poster
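The first review above asks how well rankings from distilled or finetuned accuracy track rankings obtained by training from scratch. The snippet below is a minimal, illustrative sketch of that rank-correlation check using scipy; the accuracy values, their number, and the noise level are made up for the example and are not numbers from the benchmark.

```python
# Minimal sketch of the rank-correlation check discussed in the review above:
# how well does a cheap proxy ranking (e.g. accuracy after blockwise distillation
# and finetuning) agree with the ranking from full from-scratch training?
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Made-up accuracies for 20 hypothetical architectures (illustration only).
scratch_acc = rng.uniform(60, 77, size=20)
# A noisy proxy score standing in for distilled/finetuned accuracy.
proxy_acc = scratch_acc + rng.normal(0, 2.0, size=20)

rho, pvalue = spearmanr(scratch_acc, proxy_acc)
print(f"spearman rank correlation: {rho:.3f} (p={pvalue:.3g})")
# A high rho means the proxy preserves the from-scratch ranking and is a useful
# accuracy predictor; a low rho is exactly the failure mode the reviewer notes.
```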
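The reviews also refer to pareto fronts of searched models, and the benchmark records hardware latency alongside accuracy. The following is a small self-contained sketch of how such a front can be extracted from (latency, accuracy) pairs, again with made-up numbers; it is an assumed illustration, not code from the benchmark's api.

```python
# Illustrative sketch: extracting the (latency, accuracy) pareto front from a set
# of evaluated models, as used when comparing macro vs uniform-block search spaces.
import numpy as np

rng = np.random.default_rng(1)
latency = rng.uniform(1.0, 10.0, size=200)                            # made-up latency (ms)
accuracy = 60 + 3 * np.log(latency) + rng.normal(0, 1.5, size=200)    # made-up accuracy (%)

def pareto_front(lat, acc):
    """Indices of models not dominated by any other (lower latency, higher accuracy)."""
    order = np.argsort(lat)            # scan from fastest to slowest
    front, best_acc = [], -np.inf
    for i in order:
        if acc[i] > best_acc:          # strictly better accuracy than anything faster
            front.append(i)
            best_acc = acc[i]
    return np.array(front)

idx = pareto_front(latency, accuracy)
print(f"{len(idx)} of {len(latency)} models are on the pareto front")
```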
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: summary this manuscript proposes a novel learning method to improve the robustness of unsupervised anomaly detection called robust collaborative autoencoders rca this combines two autoencoders which exchange samples from heterogeneous batches according to the rankings that each individual model assigned while the manuscript contains a number of interesting theoretical motivations its experimental section contains some weaknesses and i would hope for it to include a larger set of competitors as well as a more flexible model class with which rca is paired strengths the paper follows a standard structure and is logically organized theoretical results are provided that underpin the sample selection criteria that is used in rca in particular the experimental section compares rca against some other albeit mostly simple anomaly detection methods and the authors propose an interesting idea meant to enhance robustness a particularly desirable property when dealing with anomaly detection the authors provide code which is always a plus experimental results are computed from ten random seeds with standard deviations included a methodological description is included in section 3 weaknesses my main criticism revolves around the extent of the experimental section given the generality of ae architectures and their wide applicability to all types of ad on images text etc it would have been interesting to learn how rca fares in different scenarios for instance rca incorporated with more recent convolutional autoencoding setups is missing in the evaluation unfortunately rca is evaluated on mostly lowdimensional data and against simple competitor models and ae architectures to counter any doubts around the feasibility of rca to scale to more complex ae setups and datasets it would be interesting to see rca evaluated on more standard realworld benchmark data such as cifar10 which is widely used in the standard anomaly detection literature or with more complex autoencoders eg those proposed in huang et al 2019 an additional question that remains unexplored is how well rca can scale to more than two subae modules and whether this would be a practical thing to do experiments or some discussion in this direction would be very interesting additional remarks axes in figure 2 are illegible the table formatting is suboptimal cf the instructions for submission the formatting of alg 1 and 2 can be improved table 3 is hard to read this might be better in a figure why not move assumption 5 into the vicinity of theorem 3 since this is only required there docsep summary the submission tackles unsupervised anomaly detection specifically in a scenario where supervision labels are not available only information about the ratio of anomalous examples in the data set they suggest an architecture consisting of two autoencoders collaboratively determining anomalous samples and updating their weights based on data that is deemed normal the authors provide a theoretical analysis of the selection process and validate anomaly detection performance on a range of experiments pros interesting challenge tackling the problem of potentially contaminated data heads on instead of sidestepping it with assumptions like guaranteed normalcy of the training data set is an interesting challenge relative simplicity of the approach the suggested algorithm is a remarkably simple extension or selfregularization to vanilla autoencoding the changes
are fairly minimal losses and ae architectures remain the same at the same time the suggested duplication of aes and selection of data directly address contaminated anomaly detection data sets this allows for potentially widespread applicability of the idea to other kinds of data problems or architectures theoretical underpinning of the algorithm i applaud the authors effort in examining and motivating the suggested changes not just by experimental results but by a more rigorous theoretical analysis this is a big plus in a typically very evaluation and application driven field this especially holds true given how small the architectural changes are proving their legitimacy both theoretically and experimentally can make for a strong contribution cons the motivation and context could be stated more clearly the setting that the authors assume and the contributions to that setting are muddied throughout the paper abstract and introduction discuss various problems of dnnbased ad methods for instance overparameterization at the core of their setting however is contaminated data without any label information beyond an estimate of the ratio of anomalies i believe the authors should emphasize the setting much more clearly and motivate their choice of setting compared to more common approaches in the literature like unsupervised learning on guaranteed normal data the theoretical analysis is relatively overemphasized as stated above i believe the theoretical analysis is a strength of the paper spending four pages on the methods section and two of those on the assumptions theorems and remarks compared to a total of two pages of evaluation the authors clearly emphasize this aspect of their contribution from this perspective i believe the theoretical analysis takes up too much space especially considering the very applicationdriven nature of anomaly detection methods put bluntly the theorems are certainly interesting and worth having which is why i consider them a pro but they are not strong enough to justify the amount of space compared to eg evaluation i believe the main text should stick to the theorems and spend less time on technical details and more time contextualizing the results how strong are the results how valid are the assumptions after all the proof largely hinges on the assumptions so they deserve more scrutiny than they currently get are the theoretical results reflected in the experimental evaluation if not why the bit about the integer program to determine the selected data points taking up half a page seems like a retrospective justification for the perfectly valid design decision to pick the fraction of examples with lowest reconstruction error but otherwise does not add much the evaluation is too coarse the previous point dovetails with my main criticism the evaluation this should have been a much stronger focus of the paper given that anomaly detection is very applicationdriven the synthetic data set is nice to examine qualitative results however the evaluation is purely qualitative where quantitative metrics would also be in order the rightmost column of figure 2 is the only hint at performance we see that a lot of anomalies are not selected in both rows i might be misunderstanding something but this hints towards a massive amount of false positives and negatives the realworld experiments are lacking a lot of evaluations in my opinion the analysis is reduced to winning the auc score against the baselines on as many data sets as possible while thats desirable its not helpful in
understanding the pros and cons of certain algorithms. This is particularly true given that the experimental setup of contaminated data violates the assumptions of many of the baselines. Overestimating the share of anomalies is studied, although the given results are very hard to parse; this begs the question: what happens if I underestimate the contamination? This should lead to more anomalies being part of the data used for backprop. Given that the algorithm is based around that ratio, or an estimate thereof, I would have liked to see a stronger focus on it. As you hypothesize, your selection method biases the representation learning; this should be examined in an experimental evaluation. This is particularly true given the venue.

The presentation can be improved: this is not a decisive point, but I believe potential readers would greatly benefit from improvements in structure, writing and layout. I have gathered a number of suggestions further down.

Recommendation: generally, I believe the suggested architecture and algorithm are worth pursuing and eventually publishing. This may seem in contrast with the length of the positive feedback vs. the negative feedback, but in this case the opposite is true: the core idea is intriguing, but the paper on it can be improved. I believe the paper needs to be more precise and nuanced in answering a potential user's question: when and why should I consider this algorithm? In my view, the paper can and should be improved on two fronts: 1. the experimental evaluation needs to be more thorough and in particular less focused on winning over baselines but on understanding and showcasing defining properties of the suggested algorithm; 2. the presentation can be made much more approachable. Overall, I believe the paper as is should be rejected; I nevertheless strongly encourage the authors to improve their evaluation and manuscript and resubmit.

Questions: generally, the authors aim at fairness by fixing autoencoder structures; does this mean that RCAs generally have a multiple of learnable parameters compared to e.g. the AE baseline? Further, how did you determine the hyperparameters? I would argue a hyperparameter search is due when comparing such different models. Given that VAEs seem to be the baseline that compares most favorably according to your results, it would seem fairly obvious to try RC-VAEs, where everything is the same except that reconstruction losses are replaced either by the likelihood term of VAEs or the ELBO directly; have you considered this, and if so, why haven't you tried it? Could you elaborate more on how a user with entirely unlabeled data would go about making a good guess for the ratio of anomalies, so as to be able to use your algorithm?

Further feedback: I believe the presentation of the method and results could be improved in several places; take these as suggestions for a revision, I don't see a particular need to address these points in a rebuttal. The introduction is very long: it contains a substantial amount of related work and a fairly detailed description of the proposed method. I would encourage the authors to move those bits to the respective dedicated sections and give the reader a more precise problem formulation, and particularly what part of the problems are tackled by the contributions of this submission. Figure 1 is not particularly illustrative: one can see that subsampling and shuffling is going on, but otherwise one has to have a pretty good idea of the method to understand the illustration. On a sidenote, I would encourage the authors to investigate TikZ or similar alternatives for a cleaner style that is
more integrated with the notation of the paper. Figure 2 cuts the lower part of the top row, including a redundant legend. Algorithms 1 and 2 could also be clearer: what is S, \hat{S}_1, etc.? First it's a minibatch, then it's the output of an autoencoder. In Algorithm 2 the notation of the forward step changes; x_i is a set, then the last step of the for loop does both set and arithmetic operations on x_{i1} and x_{i2}. In this light I would argue that a more mathematical notation, in favor of a programmatic notation, would help the reader; this would also shorten the lines and make them more readable. Generally, notation could be a little clearer: \mathcal{O} is the set of anomalies, which is usually big-O notation, for which you use \mathbb{O}; sets sometimes have uppercase Greek letters, sometimes lowercase Greek letters; the probability p_i(w) is not actually the probability of w but the probability of i as a function of w. Such small inaccuracies amount to an unnecessarily increased mental burden for the reader. The results of Section 4.2 could be presented in a much clearer format: the figures are illegible at 100% zoom; putting this aside, both Figures 3a and 3b are unnecessarily difficult to process. Consider 3b: the interesting aspect of the figure is the disparity between the lines along certain circle segments; 90% of the graph is uninformative space and the nuances are lost. I have a strong, possibly subjective or biased, opinion about the way of presentation of Table 1: there is something to be said about compressing large tables, like e.g. Tables 3 and 4 in the appendix, into digestible formats; that said, I think wins, draws and losses are the wrong mindset to approach baselines to begin with, and in this particular case it oversimplifies the matter by quite a margin. The appendix leaves room for improvement, in particular wrt layout and typesetting: equations should be broken to stay within the margins; punctuation should conclude equation blocks, and within the environment instead of on the next line; equations should be indented; delimiters like parentheses should have appropriate height, for instance by making more liberal use of \left and \right for parentheses, for better readability.

The proposed approach differs from autoencoder-based anomaly detection approaches in the following ways: (a) autoencoders are trained using only selected data points with small reconstruction errors; these are selected using a sampling scheme with theoretical guarantees on convergence, and the selected points are then shuffled between two autoencoders; (b) during the testing phase, each autoencoder applies dropout to generate multiple predictions, and the averaged ensemble output is used as the final anomaly score. Some of the issues with this paper: (a) one key issue is why just two autoencoders, which the authors delegate for future work; however, it is key to understanding the utility of such an ensemble-based shuffling framework. (b) Poor presentation of results: 1. Figure 2 legend issue; 2. Figure 3a is better presented as a table (Table 3 in the appendix should be here instead); it is very hard to interpret in the current form, and similar comments apply to Figure 3b and Table 1. Also, for anomaly detection benchmarking, AUC is not sufficient and the authors have to present AUPR or F1 scores. Also, I suggest looking at these recent papers for the presentation of experimental results: httpproceedingsmlrpressv108kim20ckim20cpdf, httpsproceedingsicmlccpaper2020file0d59701b3474225fca5563e015965886paperpdf (Goyal et al., ICML 2020). (c) Theorem 3 might have a connection with the notion of r-robustness presented in https://arxiv.org/abs/2007.07365, so the authors would want to make it clear how they differ.

This paper presents a robust collaborative autoencoder (RCA) for unsupervised anomaly detection. The authors focused on the overparameterization of existing NN-based unsupervised anomaly detection methods, and the proposed method aims to overcome the overparameterization problem. The main contributions of the proposed method are that 1) it uses two autoencoders, each of which is trained using only selected data points, and 2) Monte Carlo (MC) dropout was used for inference. Although this paper has an interesting idea, I have doubts about the contributions; my comments are as below. 1. First of all, to me it was very difficult to read this paper; the notations are very confusing. 2. In the introduction section it is confusing what the main focus of this paper is. They mentioned something like "unlike previous studies our goal is to learn the weights in an unsupervised learning fashion", but because it seems the topic of this paper belongs to unsupervised anomaly detection, the labels indicating whether anomaly or not are assumed available in the training data. The point that your method is in an unsupervised learning fashion is pretty obvious; you don't need to discuss supervised approaches throughout the paper, but please clearly mention that at the beginning of the introduction section and only discuss your method and other unsupervised anomaly detection methods. 3. If overparameterization is the problem when we build a NN for unsupervised anomaly segmentation, e.g. an autoencoder (AE), we can simply think about various well-known NN regularization techniques for the AE as a remedy. I also think the two parts of the proposed method corresponding to contributions 1) and 2) work as regularization for the AE; I'm curious if there's any reason to prefer the proposed method over other regularization techniques. 4. The proposed RCA method involves an ensemble prediction by using MC dropout, described in Section 3.2. The authors mentioned this is one of their research contributions, but the use of MC dropout is quite general in neural network research. Also, while the proposed method definitely benefits from the use of MC dropout, other unsupervised anomaly detectors based on neural networks, e.g. AE and VAE, can also improve by employing MC dropout. The ablation study in Table 1 showed that RCA significantly outperforms RCAE (RCA without ensembling); the authors can implement MC-dropout-based ensemble versions of AE, VAE and other NN-based methods like Deep SVDD, and check whether the proposed method only benefits from MC dropout or RCA just outperforms others regardless of MC dropout.
### Summary:
The paper describes an autoencoder-based approach to anomaly detection. The main weakness, not untypical for papers in this application area, is the experimental section. The problem itself may be not well-defined, and of course that makes practical comparison difficult; perhaps different measures, e.g. remaining life, may be better to compare on, and give better data sets.
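As a concrete illustration of the training and scoring scheme these reviews describe (each autoencoder ranks a batch by reconstruction error, the lowest-error fraction is kept, the kept subsets are exchanged between the two autoencoders, and MC-dropout passes are averaged into the final anomaly score), a minimal sketch might look as follows. This is only a hedged reading of the reviews, not the authors' code: the architecture, the keep ratio, and all names here are assumptions.

```python
import torch
import torch.nn as nn

class AE(nn.Module):
    """Small fully connected autoencoder (architecture is an assumption)."""
    def __init__(self, dim, hidden=32, code=8):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, code))
        self.dec = nn.Sequential(nn.Linear(code, hidden), nn.ReLU(), nn.Linear(hidden, dim))
        self.drop = nn.Dropout(0.2)  # kept active at test time for MC sampling

    def forward(self, x):
        return self.dec(self.drop(self.enc(x)))

def per_sample_error(model, x):
    # squared reconstruction error per example
    return ((model(x) - x) ** 2).mean(dim=1)

def collaborative_step(ae1, ae2, opt1, opt2, x, keep_ratio=0.9):
    """One step in the spirit of the reviews: each AE keeps the lowest-error
    fraction of the batch, and the kept subsets are exchanged before the update."""
    k = max(1, int(keep_ratio * x.shape[0]))
    with torch.no_grad():
        idx1 = per_sample_error(ae1, x).topk(k, largest=False).indices
        idx2 = per_sample_error(ae2, x).topk(k, largest=False).indices
    for model, opt, idx in ((ae1, opt1, idx2), (ae2, opt2, idx1)):  # exchange selections
        opt.zero_grad()
        per_sample_error(model, x[idx]).mean().backward()
        opt.step()

def mc_dropout_score(ae1, ae2, x, n_samples=10):
    """Test-time anomaly score: average reconstruction error over several
    dropout-active forward passes of both autoencoders."""
    ae1.train(); ae2.train()  # keep dropout stochastic, as the reviews describe
    with torch.no_grad():
        draws = [per_sample_error(m, x) for m in (ae1, ae2) for _ in range(n_samples)]
    return torch.stack(draws).mean(dim=0)
```

In a contaminated-data setting, the keep ratio would typically be derived from the assumed anomaly ratio that several reviewers question, which is why sensitivity to over- and underestimating that ratio matters.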
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

The authors identify that classical learning is running into limitations due to the power and scale of computing systems. The authors suggest quantum learning might supplant classical learning and solve these fundamental challenges, and they suggest a method that is efficient in the number of qubits, which can be quite precious. The authors believe this work is well motivated by the fact that little work has been done in finding ways to combine the virtues of quantum methods with the power of deep neural networks. The authors propose an encoding scheme to map data efficiently to a qubit to exploit multiple qubit samples. The ability to entangle multiple qubits to represent points on the Bloch sphere is a key concept for efficiently using the qubit space; I believe the biggest contribution this paper makes is this efficient mapping of data to qubit space. Overall, this paper demonstrates that some aspects of neural computing can be mapped to a quantum qubit system. This is a timely topic, as the scale and energy usage of classical computers for deep learning training is becoming untenable and many are thinking about ways to make this problem tractable. The most novel concept in the paper is the symmetry described between the nature of how entangled qubits can encode information and how a neural network progressively makes data more separable. The actual learning process appears to be done on a classical computer (backprop and update) and then mapped back into the quantum system; to me, a much more interesting result would be exploiting the quantum nature of the computer to solve the nonconvex optimization problem of reducing the loss via layers of features. The generative aspect described is a nice side effect, as the authors indicate, but isn't really a novel concept. The results of the paper really are proof that the basic functionality is there, but not really a proof of concept in a general way: the dimensionality of a small neural network for MNIST had to be reduced to be realizable on an actual quantum system. This weakens the claims of these results, as it may be untenable as complexity is scaled up; nevertheless, I believe this result does demonstrate an existence proof and the community will benefit. In general, the language should be cleaned up; there are just a few sentences with a slightly odd structure. Section 3: I can see how the data is encoded to make it linearly separable, but how does this relate to backprop? I can see that backprop isn't really handled by the quantum system, so the separable mapping of the data is really just a general data concept. Section 3.1: it looks like the actual backprop is done on a classical computer; a much more interesting result would be to do the actual gradient descent in the quantum domain. Section 3.2: is the data encoding done on a classical computer? Section 3.5: it looks like the generative concept is really just exploiting the stochastic nature of qubits as a random generator; I think this points to a strong parallel between neural networks and quantum systems, but isn't really a new concept.

The paper claims to introduce a new quantum machine learning framework called GenQu. However, the description of the framework is very vague (using classical computers to optimize the parameters of a fixed quantum circuit) and hardly novel; in fact, the same basic ideas are so well known in the community that they are described in detail as usage examples for popular quantum
computing platforms such as Qiskit and IBM Q. The only remotely nontrivial part of the paper is contained in Section 4.2 about quantum deep learning, where the authors consider the MNIST data set. Upon closer inspection it turns out that they use PCA to reduce the dataset to 4 dimensions, which is in turn used to train a quantum neural network to perform binary classification, i.e. to discriminate between 0-instances and 5-instances. The authors claim that such a quantum classifier provides an advantage versus a convolutional neural network in terms of (1) the number of training epochs, while ignoring the time needed to perform PCA, and (2) the number of parameters, while ignoring the parameters needed to describe the principal components. Additionally, no confidence intervals are visible in Fig. 7, which suggests that the data might have been obtained from a single experimental run. Finally, there are several instances of sloppy writing, such as the inconsistent usage of math mode for variables, the statement "pphi 0", the typo "iws" instead of "is", etc.

I agree that promoting experiments for real quantum hardware is important, but I don't think the team has yet created a working platform that could be accepted to ICLR. If the open-sourced platform is the main contribution, rather than a new understanding of how quantum computers could be useful for machine learning problems, then the authors should submit the manuscript after having the open-sourced software available. Furthermore, a lot of the wording should be changed; the current version sounds like they are proposing a new quantum machine learning framework while they are creating an open-sourced platform.

Summary: the authors propose a framework, GenQu, for learning classical data using quantum computation. The classical computer would encode the classical data into quantum circuits; the quantum computer would then run the quantum circuit, measure the resulting quantum state, and feed the measurement data back to the classical machine. This process would repeat until the classical computer outputs the final result.

Reasons for score: this framework is not new and has been widely adopted in the quantum machine learning community; it is unclear to me what is being proposed by this work. The framework is known as a variational quantum-classical algorithm, and there is extensive literature for different applications in quantum computing, such as quantum chemistry, simulating quantum field theory, optimization, and machine learning; for example, see references [1, 2] for existing proposals for machine learning applications. Due to the lack of meaningful contributions I would not recommend acceptance.

Pros / cons: 1. The framework is not new; it is not scientifically correct to claim the proposal of a new framework (GenQu) when this has already been widely adopted in the quantum machine learning community. 2. The authors did not provide any new theoretical insights into how quantum computation can learn classical data better. 3. The numerical experiments were not strong enough to justify any form of advantage from using the quantum computer; furthermore, these numerical experiments have already been presented in the literature, for example a tutorial in TensorFlow Quantum [3] has also included such an experiment.

[1] Havlicek, Vojtech, et al. Supervised learning with quantum-enhanced feature spaces. Nature 567.7747 (2019): 209-212.
[2] Farhi, Edward, and Hartmut Neven. Classification with quantum neural networks on near term processors. arXiv preprint arXiv:1802.06002 (2018).
[3] Peruzzo, Alberto, et al. A variational eigenvalue solver on a photonic
quantum processor. Nature Communications 5 (2014): 4213.
[4] https://www.tensorflow.org/quantum/tutorials/mnist

This paper presents GenQu, a hybrid and general-purpose quantum framework for learning classical data through quantum states by encoding two dimensions of data per one qubit. They demonstrate the effectiveness of their framework via two classical classification tasks, where 1 and 2 qubits are used respectively. This paper is more like an entry-level tutorial rather than a technical paper; more technical contributions are needed towards a paper. 1. One of the key contributions claimed by the authors is that they show the power of encoding two dimensions in one qubit, which thereby reduces the quantum state dimensionality by 2^{n/2}. I am super surprised by this claim: what is your baseline, one qubit per one dimension of classical information? Could you refer to the paper from which you get this baseline? There are just too many papers [1] about how to encode classical data into quantum states; the coding scheme proposed in the paper is not novel and not even state-of-the-art. 2. What is the key difference between your framework and TensorFlow Quantum [2]? For me, TensorFlow Quantum is a much stronger framework; for example, the single-qubit kernelized classification case study in Section 3.3 is just an illustrative example in [2].

[1] Biamonte, Jacob, et al. Quantum machine learning. Nature 549.7671 (2017): 195-202.
[2] Broughton, Michael, et al. TensorFlow Quantum: a software framework for quantum machine learning. arXiv preprint arXiv:2003.02989 (2020).

### Summary:
This paper proposes a new quantum machine learning framework which is evaluated on the MNIST dataset. While the paper was relatively well written, reviewers noted that most of the ideas are already well established and used in the quantum machine learning community; thus it was not clear what novelty is provided relative to related work.
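To make the experimental setup the reviewers debate more concrete (PCA compression of MNIST-style inputs to four features, followed by encoding two features per qubit), here is a small sketch of that preprocessing plus a generic angle-style encoding. The RY/RZ encoding, the rescaling, and all names are assumptions for illustration, not the paper's actual scheme, and the single-qubit states are simply simulated with NumPy rather than any quantum SDK.

```python
import numpy as np
from sklearn.decomposition import PCA

def ry(theta):
    # single-qubit rotation about the Y axis
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]])

def rz(phi):
    # single-qubit rotation about the Z axis
    return np.array([[np.exp(-1j * phi / 2), 0],
                     [0, np.exp(1j * phi / 2)]])

def encode_pair(x1, x2):
    """Place two rescaled features on one qubit as |psi> = RZ(x2) RY(x1) |0>."""
    ket0 = np.array([1.0, 0.0], dtype=complex)
    return rz(x2) @ ry(x1) @ ket0

# PCA-compress flattened images to 4 features (stand-in data here),
# then rescale each feature into [0, pi] so it can serve as a rotation angle.
X = np.random.rand(100, 784)
X4 = PCA(n_components=4).fit_transform(X)
X4 = np.pi * (X4 - X4.min(axis=0)) / (np.ptp(X4, axis=0) + 1e-9)

# Two features per qubit: 4 PCA components occupy 2 qubits per sample.
states = [(encode_pair(*row[:2]), encode_pair(*row[2:])) for row in X4]
```

Under this kind of encoding, the parameters of a small variational circuit acting on those qubits would then be optimized by a classical loop, which is the hybrid pattern the last reviewer identifies as a standard variational quantum-classical algorithm.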
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 4477, 4271, 326, 8946, 4715, 310, 3515, 715, 7364, 1955, 281, 1612, 285, 4311, 273, 12672, 2718, 253, 4477, 1804, 6318, 4715, 1537, 7642, 386, 8946, 4715, 285, 8415, 841, 7936, 7881, 50276, 395, 253, 4477, 1804, 247, 1332, 326, 310, 5919, 275, 253, 1180, 273, 42414, 534, 476, 320, 3240, 18153, 50275, 783, 4477, 2868, 436, 789, 310, 973, 17194, 407, 253, 958, 326, 1652, 789, 556, 644, 2218, 275, 4560, 4088, 281, 13398, 253, 37398, 273, 6318, 3082, 342, 253, 1612, 273, 3676, 11454, 6928, 253, 4477, 12661, 271, 9706, 6974, 281, 3711, 941, 14556, 281, 247, 38294, 281, 22059, 2709, 38294, 3530, 253, 3745, 281, 994, 2134, 2709, 42414, 281, 1957, 2792, 327, 253, 787, 3770, 15269, 310, 247, 2234, 4473, 323, 14556, 970, 253, 38294, 2317, 50276, 74, 2868, 253, 5962, 7680, 436, 2929, 2789, 310, 436, 5919, 10603, 273, 941, 281, 38294, 2317, 50276, 1189, 455, 436, 2929, 14371, 326, 690, 7794, 273, 11454, 12672, 476, 320, 18301, 281, 247, 6318, 38294, 985, 50276, 2520, 310, 247, 14793, 9400, 347, 253, 4311, 285, 2341, 10393, 273, 8946, 12823, 323, 3676, 4715, 3733, 310, 7552, 440, 1866, 494, 285, 1142, 403, 4680, 670, 4088, 281, 1056, 436, 1895, 10649, 494, 50276, 783, 954, 4460, 4473, 275, 253, 2929, 310, 253, 10377, 2529, 875, 253, 3753, 273, 849, 36255, 42414, 476, 22573, 1491, 285, 849, 247, 11454, 2990, 31414, 2789, 941, 625, 2533, 17980, 50276, 783, 4588, 4715, 1232, 4620, 281, 320, 2218, 327, 247, 8946, 4382, 896, 8560, 285, 5731, 285, 840, 18301, 896, 715, 253, 6318, 985, 50276, 936, 479, 247, 1199, 625, 4722, 906, 651, 320, 38883, 253, 6318, 3753, 273, 253, 4382, 281, 8415, 436, 1327, 44181, 13757, 1895, 273, 8493, 253, 2957, 3066, 8090, 273, 3386, 253, 1006, 800, 4809, 2529, 310, 247, 5322, 1930, 8222, 347, 253, 4477, 5224, 533, 310, 2649, 1663, 247, 4460, 4473, 50275, 783, 1543, 273, 253, 2929, 1663, 403, 4737, 326, 253, 5044, 13175, 310, 627, 533, 417, 1663, 247, 4737, 273, 4473, 275, 247, 2087, 1039, 50276, 783, 7877, 1319, 273, 247, 1355, 11454, 2990, 323, 278, 79, 382, 574, 281, 320, 3777, 281, 320, 2104, 281, 320, 8156, 327, 271, 4588, 6318, 985, 436, 5075, 561, 253, 3916, 273, 841, 1543, 347, 352, 778, 320, 440, 1866, 494, 347, 10454, 310, 24337, 484, 50276, 7594, 8299, 891, 2868, 436, 906, 1057, 7568, 271, 6242, 4737, 285, 253, 3114, 588, 5649, 50276, 249, 2087, 253, 3448, 943, 320, 22269, 598, 50276, 9088, 403, 816, 247, 1643, 14683, 342, 247, 5777, 8909, 2605, 50276, 4674, 495, 891, 476, 923, 849, 253, 941, 310, 16202, 281, 1056, 352, 23352, 39690, 50276, 2858, 849, 1057, 436, 14588, 281, 896, 8560, 891, 476, 923, 326, 37924, 310, 2649, 1663, 15726, 407, 253, 6318, 985, 594, 253, 39690, 10603, 273, 253, 941, 310, 1663, 816, 247, 2087, 941, 4473, 50276, 4674, 4562, 352, 4453, 751, 253, 4588, 896, 8560, 310, 2218, 327, 247, 8946, 4382, 50276, 66, 1199, 625, 4722, 906, 651, 320, 281, 513, 253, 4588, 11786, 18499, 275, 253, 6318, 5028, 50276, 4674, 4567, 310, 253, 941, 9706, 2218, 327, 247, 8946, 4382, 50276, 4674, 4791, 4453, 751, 253, 1006, 800, 4473, 310, 1663, 816, 38883, 253, 19191, 3753, 273, 42414, 347, 247, 3632, 14156, 50276, 74, 1158, 436, 2792, 281, 247, 2266, 7529, 875, 11454, 6928, 285, 6318, 2718, 533, 310, 2649, 1663, 247, 747, 4473, 5474, 339, 431, 248, 2929, 3916, 281, 9569, 247, 747, 6318, 5145, 4715, 7792, 1925, 730, 371, 2299, 253, 5740, 273, 253, 7792, 1077, 21248, 970, 8946, 12823, 281, 22318, 253, 3602, 273, 247, 4229, 
6318, 5049, 285, 10693, 4460, 275, 958, 253, 1072, 5044, 5697, 403, 594, 973, 4304, 275, 253, 3114, 326, 597, 403, 2529, 275, 2508, 347, 10393, 6667, 323, 4633, 6318, 12672, 13498, 824, 347, 2805, 1886, 262, 285, 18890, 78, 2805, 253, 760, 27938, 37825, 629, 273, 253, 2929, 310, 6221, 275, 2593, 5976, 670, 6318, 3676, 4715, 835, 253, 4477, 1908, 253, 278, 79, 382, 941, 873, 2220, 8003, 15981, 352, 7819, 562, 326, 597, 897, 268, 6357, 281, 4796, 253, 10895, 281, 577, 10103, 534, 310, 275, 1614, 908, 281, 6194, 247, 6318, 11454, 2990, 281, 1347, 8985, 9162, 26332, 281, 30530, 875, 470, 249, 4777, 285, 608, 249, 4777, 253, 4477, 1750, 326, 824, 247, 6318, 30410, 3400, 271, 5750, 7147, 247, 27311, 267, 11454, 2990, 275, 2426, 273, 50276, 18, 253, 1180, 273, 3733, 44540, 1223, 23111, 253, 673, 3058, 281, 1347, 268, 6357, 285, 50276, 19, 253, 1180, 273, 3602, 1223, 23111, 253, 3602, 3058, 281, 6266, 253, 8624, 4295, 23000, 642, 7162, 11508, 403, 7985, 327, 3036, 818, 534, 5936, 326, 253, 941, 1537, 452, 644, 2797, 432, 247, 2014, 5661, 1408, 4720, 627, 403, 2067, 10872, 273, 1499, 45695, 4028, 824, 347, 253, 16706, 10393, 273, 14168, 4438, 323, 4903, 253, 3908, 268, 2162, 50276, 17, 253, 1745, 80, 891, 8819, 3185, 273, 310, 1162, 2428, 406, 339, 2059, 5194, 326, 14312, 4679, 323, 1524, 6318, 10309, 310, 1774, 533, 891, 13414, 1158, 253, 2285, 556, 2568, 281, 2794, 247, 2444, 5147, 326, 812, 320, 7607, 281, 17857, 32888, 604, 253, 13279, 47549, 5147, 310, 253, 2022, 7680, 2581, 685, 247, 747, 4685, 273, 849, 6318, 12823, 812, 320, 4217, 323, 5145, 4715, 3237, 840, 253, 4477, 943, 11929, 253, 7714, 846, 1907, 253, 13279, 47549, 3694, 2130, 33810, 247, 2257, 273, 253, 41066, 943, 320, 4391, 253, 1655, 2715, 7835, 751, 597, 403, 36636, 247, 747, 6318, 5145, 4715, 7792, 1223, 597, 403, 6153, 271, 13279, 47549, 5147, 50274, 8774, 50276, 783, 4477, 12661, 247, 7792, 730, 371, 323, 4715, 8946, 941, 970, 6318, 13782, 253, 8946, 4382, 651, 22573, 253, 8946, 941, 715, 6318, 14174, 253, 6318, 4382, 651, 840, 1408, 253, 6318, 5049, 2557, 253, 4795, 6318, 1375, 285, 3997, 253, 6814, 941, 896, 281, 253, 8946, 5145, 436, 1232, 651, 10280, 1919, 253, 8946, 4382, 3453, 253, 2457, 906, 50274, 250, 3743, 323, 4868, 50275, 2520, 7792, 310, 417, 747, 285, 556, 644, 7561, 8671, 275, 253, 6318, 5145, 4715, 3114, 352, 310, 12744, 281, 479, 752, 310, 1146, 4081, 407, 436, 789, 253, 7792, 310, 1929, 347, 247, 39762, 6318, 37347, 5933, 285, 627, 310, 9470, 6239, 323, 1027, 4893, 275, 6318, 12672, 824, 347, 6318, 18090, 948, 8287, 6318, 1673, 3762, 13757, 285, 5145, 4715, 323, 1650, 923, 10414, 337, 374, 323, 5368, 18595, 323, 5145, 4715, 4893, 1955, 281, 253, 3480, 273, 14282, 9021, 891, 651, 417, 5583, 14924, 50276, 856, 84, 50275, 5040, 50275, 18, 253, 7792, 310, 417, 747, 352, 310, 417, 50164, 3451, 281, 1750, 253, 10419, 273, 247, 747, 7792, 730, 371, 672, 436, 556, 2168, 644, 7561, 8671, 275, 253, 6318, 5145, 4715, 3114, 50276, 19, 50276, 783, 4477, 858, 417, 2085, 667, 747, 10527, 16039, 715, 849, 6318, 13782, 476, 3037, 8946, 941, 1805, 50276, 20, 253, 10704, 4679, 497, 417, 2266, 2217, 281, 15249, 667, 830, 273, 5750, 970, 253, 6318, 4382, 33810, 841, 10704, 4679, 452, 2168, 644, 3559, 275, 253, 6239, 323, 1650, 247, 23647, 275, 13148, 5449, 6318, 495, 556, 671, 2908, 824, 271, 3368, 50273, 18, 31795, 282, 76, 3273, 42565, 348, 1162, 355, 22296, 4715, 342, 6318, 35465, 4735, 8470, 3753, 49609, 2357, 2504, 6247, 1384, 4529, 805, 50276, 19, 2080, 5801, 1407, 1034, 285, 288, 435, 10082, 425, 1261, 9162, 342, 6318, 
11454, 6928, 327, 2822, 1307, 26967, 549, 32693, 638, 3845, 549, 32693, 1093, 9992, 23, 4699, 4765, 50276, 20, 591, 86, 35504, 355, 589, 936, 1162, 355, 247, 39762, 25023, 47037, 327, 247, 2359, 5120, 6318, 13732, 3753, 10924, 608, 4059, 5976, 1012, 50276, 21, 5987, 2700, 26109, 5449, 2061, 46320, 85, 37929, 3610, 79, 382, 7152, 33032, 2520, 2929, 10262, 730, 371, 247, 9769, 285, 2087, 27299, 6318, 7792, 323, 4715, 8946, 941, 949, 6318, 3054, 407, 9706, 767, 10103, 273, 941, 591, 581, 38294, 597, 7568, 253, 12510, 273, 616, 7792, 3066, 767, 8946, 9162, 8892, 835, 337, 285, 374, 42414, 403, 908, 2975, 436, 2929, 310, 625, 751, 271, 5857, 5251, 23647, 2581, 685, 247, 7681, 2929, 50276, 3062, 7681, 9021, 403, 3058, 4404, 247, 2929, 50275, 18, 581, 273, 253, 2234, 9021, 7558, 407, 253, 2488, 310, 326, 597, 921, 253, 1612, 273, 9706, 767, 10103, 275, 581, 38294, 436, 7624, 11355, 253, 6318, 1375, 7877, 1319, 407, 374, 79, 19, 891, 717, 2221, 9861, 407, 436, 1750, 752, 310, 634, 8245, 581, 38294, 591, 581, 7877, 273, 8946, 1491, 812, 368, 3730, 281, 253, 2929, 432, 534, 368, 755, 436, 8245, 627, 403, 816, 1512, 1142, 9380, 18, 670, 849, 281, 22573, 8946, 941, 715, 6318, 3054, 50276, 783, 12425, 6974, 4081, 275, 253, 2929, 310, 417, 4460, 285, 417, 1014, 1375, 23037, 14387, 50274, 19, 752, 310, 253, 2234, 3064, 875, 634, 7792, 285, 13148, 5449, 6318, 19, 323, 479, 13148, 5449, 6318, 310, 247, 1199, 10046, 7792, 323, 1650, 253, 2014, 38294, 10295, 1025, 9162, 1083, 1263, 275, 2593, 5922, 310, 816, 271, 47386, 1650, 275, 374, 50276, 18, 1794, 19451, 442, 480, 317, 706, 1162, 355, 6318, 5145, 4715, 3753, 608, 30626, 35276, 4240, 23627, 18161, 50276, 19, 3982, 251, 278, 44023, 1162, 355, 13148, 5449, 6318, 247, 3694, 7792, 323, 6318, 5145, 4715, 549, 32693, 638, 3845, 549, 32693, 1518, 1229, 1717, 2511, 9169, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 747, 6318, 5145, 4715, 7792, 534, 310, 6760, 327, 253, 278, 79, 382, 10895, 1223, 253, 2929, 369, 4942, 973, 3542, 30628, 4879, 326, 954, 273, 253, 5697, 403, 2168, 973, 4232, 285, 908, 275, 6318, 5145, 4715, 3114, 3021, 352, 369, 417, 2590, 752, 38135, 310, 2530, 4103, 281, 2905, 789 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

Updated score post rebuttal.

The paper proposes a new method, Autoinverse, in the field of inverting neural networks, that incorporates the uncertainty in the modelling of the NFP (natural forward process) through a neural network surrogate. By adding the epistemic uncertainty of the surrogate function (due to mis-modelling) and the aleatoric uncertainty in the training data (noise in the data), the new method favours points that have low uncertainty and therefore are a better match to the NFP. Incorporating uncertainty in the inversion loss also results in desirable behaviour, such as (1) enforcing the feasibility of solutions, as points with low feasibility should have high uncertainty, (2) not requiring regularisation techniques, and (3) not requiring careful initialisation.

1. Originality. The Autoinverse technique has two components: (1) replacing the surrogate neural network with a neural network that is capable of predictive uncertainty, and (2) modifying the loss function used to find accurate designs given a pretrained surrogate function with frozen parameters. The first component of this technique is a well-established baseline in the literature, and deep ensembles have been very well studied in their ability to accurately model predictive uncertainty in a bunch of different domains and tasks. In fact, the decomposition of the uncertainty learnt using deep ensembles, or other methods that model heteroscedastic regression, into components that model aleatoric and epistemic uncertainty is also well studied [1]; however, the authors do not cite [1] or discuss any potential pitfalls of modelling noise in such a way (see [2] as an example). The second component is the novel contribution of this paper, in my understanding: by using a neural network that models both heteroscedastic aleatoric uncertainty and epistemic uncertainty, this method can use these estimates as auxiliary terms in the inverse cost function to directly influence the training of the designs. By adding epistemic uncertainty to the inverse cost function, points that have high epistemic uncertainty are down-weighted in a sense, and these are usually points where the surrogate is not performing well or is performing pathologically; by adding aleatoric uncertainty to the cost function, points with high data noise are also down-weighted. This simple addition of two terms to the cost function results in quite impressive empirical performance.
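To make the decomposition this reviewer refers to concrete, the following is a minimal sketch (not code from the paper under review; the interface and names are hypothetical) of how an ensemble of heteroscedastic surrogates yields separate aleatoric and epistemic estimates via the standard mean-of-variances / variance-of-means split:

```python
import numpy as np

def ensemble_uncertainty(members, x):
    """Decompose an ensemble's predictive uncertainty at input x.

    `members` is a list of heteroscedastic surrogates; each is assumed to
    return a predictive mean and variance for the forward process y = f(x).
    Returns the ensemble mean, the aleatoric part (average predicted noise
    variance) and the epistemic part (disagreement among the member means).
    """
    mus, sigma2s = zip(*(m(x) for m in members))
    mus, sigma2s = np.stack(mus), np.stack(sigma2s)

    mean = mus.mean(axis=0)           # ensemble prediction, f_mu(x)
    aleatoric = sigma2s.mean(axis=0)  # data noise, E[sigma^2]
    epistemic = mus.var(axis=0)       # model disagreement, Var[mu]
    total = aleatoric + epistemic     # total predictive variance (assumed to play the role of f_sigma)
    return mean, aleatoric, epistemic, total
```

The `total` term is what an Autoinverse-style cost could penalise, while the split into the two parts is what the ablations suggested in this review would probe.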
2. Quality. In my opinion the paper is well written, has good motivations for the problem it is tackling, and has strong ablations that answered quite a few of my initial doubts/hesitations about the empirical strengths of the technique. I have some concerns regarding the thoroughness of the uncertainty quantification methods studied and a careful analysis of the strengths and weaknesses of different methods. For example, deep ensembles are known to be state of the art in uncertainty quantification but have much higher compute time and storage requirements than other methods. The authors mention that the longer training time for ensembles is not a concern as they are trained in parallel; I disagree with this point, as I feel that acknowledging the GPU hours required to train a method is quite important, especially with big compute quite easily getting out of hand to obtain SOTA, even though there are certain groups with the resources to trivially parallelise ensembles. If Autoinverse is to be established as a strong contender, the added compute time, memory footprint and inference time need to be acknowledged. Another factor here is maybe exploring a few other uncertainty quantification techniques as an ablation in the paper (MC dropout, more efficient ensembles, etc.). I feel this comparison is important to understand the contribution of adding an aleatoric and epistemic uncertainty term to the cost function vs. the quality of these uncertainty estimates w.r.t. the compute required. It might be possible that deep ensembles are overkill for the inversion problem, and a lot of compute is used to obtain uncertainty estimates that might be approximated more cheaply. As the authors mention, it is a very interesting idea to capture the Pareto front between accuracy and uncertainty in an inversion task.

3. Clarity. The paper is quite easy to follow and the experiments chosen make logical sense. I would argue that it is more important to describe the inversion technique and cost function before talking about uncertainty quantification, as it is maybe more prudent to motivate the exact problem that the authors wish to tackle. In the current ordering it might seem that the authors use a well-known methodology of uncertainty quantification and find a use case that works well, though this is obviously not true; a reordering of the sections might help mitigate this.

4. Significance. Uncertainty quantification in deep neural network applications is of fundamental importance: in order for DNN methods to be applicable in safety-critical domains, it is very important that we measure their performance, calibration and out-of-distribution robustness. To my knowledge there has not been prior work that adequately addresses the surrogate–NFP mismatch when formulating the cost function for inversion. This paper's application is quite significant, and I believe that the empirical results show that modelling uncertainty in a principled way can result in multiple benefits. However, I feel this paper is lacking in terms of adequately populating the Pareto front between uncertainty and accuracy, by proposing a single data point (ensembles). While the current proposed method cannot be argued with in terms of empirical performance, I feel this paper might be a good place to populate more points here, to establish a better understanding for the community of how the quality of uncertainty estimates affects the performance of inversion.

[1] Kendall, Alex, and Yarin Gal. What uncertainties do we need in Bayesian deep learning for computer vision? Advances in Neural Information Processing Systems 30, 2017.
[2] Seitzer, Maximilian, et al. On the pitfalls of heteroscedastic uncertainty estimation with probabilistic neural networks. arXiv preprint arXiv:2203.09168, 2022.

I believe the authors should address the additional compute requirements needed for their method: deep ensembles are the least efficient way of obtaining uncertainty estimates and can result in significantly more GPU hours for training and inference. These should be addressed.

docsep

The paper presents an approach to solving inverse problems, i.e. find parameters $\mathbf{x}$ that explain/produce given observations $y$ when passed through the forward model $f(\mathbf{x})$. The approach explicitly models and accounts for the uncertainty of the forward process by learning a surrogate of it using a deep ensemble. The deep ensemble models the forward process as $y \sim \mathcal{N}\big(\mathbb{f}_\mu(\mathbf{x}),\, \mathbb{f}_\sigma(\mathbf{x})\big)$, with $\mathbb{f}_\sigma(\mathbf{x})$ explicitly modelling the uncertainty of the forward process. The $\mathbb{f}_\sigma(\mathbf{x})$ is then plugged into two inversion methods, the neural adjoint and a tandem architecture, producing their uncertainty-aware variants. As a result, when looking for the inversion solution, the method will avoid $\mathbf{x}$ which come with a high uncertainty for the $y$ that they produce. The two methods are evaluated on three benchmarks and demonstrate good inversion performance.
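The uncertainty-aware inversion both reviews describe can be illustrated with a short sketch. This is not the authors' implementation: the surrogate interface, the optimiser and the weighting constants are assumptions made for illustration; it only shows the gradient-descent-on-the-design idea with the two uncertainty terms added to the reconstruction cost.

```python
import torch

def invert(surrogate, y_target, x_init, steps=1000, lr=1e-2, w_al=1.0, w_ep=1.0):
    """Neural-adjoint-style inversion with an uncertainty penalty (sketch only).

    `surrogate` is a frozen model assumed to return (mu, aleatoric_var,
    epistemic_var) for a candidate design x; w_al and w_ep are hypothetical
    knobs, not values from the paper.
    """
    x = x_init.clone().detach().requires_grad_(True)
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        mu, var_al, var_ep = surrogate(x)
        # Reconstruction error plus the two uncertainty terms the reviews describe:
        # candidates whose predictions the surrogate is unsure about are penalised.
        loss = ((mu - y_target) ** 2).sum() + w_al * var_al.sum() + w_ep * var_ep.sum()
        loss.backward()
        opt.step()
    return x.detach()
```

In this sketch the surrogate (for instance an ensemble as in the earlier snippet) stays frozen, and only the candidate design x is updated.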
The paper is rather well written. The main idea of the paper is that, when looking for inverse solutions $\mathbf{x}$ that could produce a given observation $y$, one should avoid $\mathbf{x}$ that come with high uncertainty for their predicted $y$, as quantified by the deep ensemble and the estimated $\mathbb{f}_\sigma(\mathbf{x})$ parameter. In fact, $\mathbb{f}_\sigma(\mathbf{x})$ can be decomposed into two terms, one measuring aleatoric uncertainty and the other the epistemic uncertainty; the latter is the extent to which the components of the ensemble differ with respect to the average prediction, i.e. the $\mathbb{f}_\mu(\mathbf{x})$. A high epistemic uncertainty can be evidence that $\mathbf{x}$ strays away from the training distribution and thus the base models produce very different predictions. Incorporating the term in the loss functions keeps the search for an inverse solution away from such high-uncertainty regions, in which the learned surrogate is probably not a very accurate proxy of the true forward process.

If I understood correctly, the paper only provides a point estimate, i.e. a single $\mathbf{x}$ given a desired $y$ (and, along the above discussion, the one with the least uncertainty). To my understanding, in solving inverse problems one is also strongly interested in accessing the full posterior of $\mathbf{x}$, i.e. $p(\mathbf{x} \mid y)$, something that is delivered by generative methods such as the invertible neural networks which the authors cite and compare as one of the baselines. There is no discussion, unless I missed it, on the point estimate vs. posterior distribution approach. This is a weakness of the paper, and one could also say a weakness of the method.

The authors adequately addressed the limitations and potential negative societal impact of their work.

docsep

The paper discusses an approach for inverting surrogate functions based on neural networks in the face of epistemic and aleatoric uncertainty.

Strengths:
- The problem statement (quantification of uncertainty for invertible processes) is an important problem.
- The idea of modelling uncertainty as outlined in Section 3.1 is a reasonable one.
- The authors plan to release the spectral printer dataset; the new dataset would be beneficial for the research community.

Weaknesses:
- The writing looks not entirely clear (see the question section).
- Novelty could also be emphasised, as it is unclear where the proposed method stands amongst the state-of-the-art models such as variational autoencoders and normalising flows (see Q1 in the questions section).

I put the score as reject for now, but look forward to the rebuttal and hope the authors could address the questions appropriately. The score is updated, as the concerns have been thoroughly addressed during the rebuttal (see the discussion below).

There is a need to discuss the limitations of the work more. One way to address it would be to use standard benchmarks such as the ones proposed in the standard invertible neural networks literature, e.g. the MNIST and CIFAR experiments from Rezende and Mohamed (2015) or the experimental section of Grathwohl et al. (2019); otherwise it would be useful to state why such experiments are not like-for-like or not possible.
Rezende, D. and Mohamed, S. Variational inference with normalizing flows. In International Conference on Machine Learning, 2015, pp. 1530–1538. PMLR.

Grathwohl, W., Chen, R. T. Q., Bettencourt, J., Sutskever, I., and Duvenaud, D. FFJORD: Free-form continuous dynamics for scalable reversible generative models. International Conference on Learning Representations, 2019.

### Summary:
Thanks to the authors for submitting this super interesting work. The reviewer discussion reflected an overall satisfaction with the submission, author responses, and updated manuscript. The additional experiments directly addressed reviewer concerns and contributed to increased scores and clarifications in the submission. Given the clear consensus and reviewer enthusiasm, I recommend acceptance. Well done.
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

The authors describe a focused belief propagation strategy for learning NNs. They argue that it continues to exhibit some of the same nice properties that stochastic gradient methods have in practice, with the added benefit of allowing the computation of approximate marginals to improve the accuracy of predictions.

Clarity of writing: the manuscript contains a significant number of typos that, while irritating to read, do not inhibit understanding. The larger issues are the plethora of undefined or under-defined terminology (e.g. channel functions, damping, etc.), the imprecise mathematical formulations (e.g. the dimension of the vectors $z^l$, etc.), and the lack of motivation for mathematical formulations (e.g. BP is never really defined, and equations 8–19 are unmotivated).

Novelty: the general approach with the prior seems novel to me, but it is a bit unclear exactly what other pieces are novel here compared to existing work.

Significance: while it is interesting that a BP-style message-passing approach can achieve comparable levels of performance to SGD, the overall significance of the work seems somewhat limited. In particular, I'm not sure that the authors really present a compelling example of when this approach would be preferred over pure SGD-based solutions.

Specific comments:
- The introduction makes vague claims without support (e.g. no citations for applications of BP, "additional approximation turns out to be benign", etc.); consider making it more precise.
- The paper conflates factorizations with factor graphs, which aren't really discussed at all.
- Non-homogeneous / inhomogeneous.
- What is a scalar channel function? It needs a definition.
- "The training error is usually lower for optimized configurations from message passing schemes, suggesting that these algorithms are able to achieve higher capacity than SGD-based algorithms given the same test error": while looking at the plots, the convergence of some of the BP variants is certainly better than SGD, but I'm not sure that your statement would continue to hold if you ran for another 100 epochs.
- Do you use adaptive step size methods like Adam for SGD?
- The experiments make claims about deep neural networks, but most of the experimental results are on shallow networks.
- The number of repeated runs in the experiments (5 in some cases) might be a little small.

An interesting approach that is hampered by a poor presentation, somewhat limited experimental results, and only minor justification for why it should be seriously considered as an alternative to more well-studied approaches for training neural networks.

docsep

Summary: this paper develops a class of fBP-based message-passing algorithms by adding a reinforcement term to the BP equations, and shows equivalent performance to the binary networks in experiments.

Main strengths: this paper's main strength is that the authors' method of using message passing to train NNs is pretty interesting; the introductory section is well written, and the implementation (GitHub repo) is intended to be provided.

Main weaknesses: the fundamental shortcoming of this study is that, while the basic concept is intriguing, it does not appear to make a significant impact, much like the various flavors of SGD. I invite the authors to make the further improvement suggested in Section 4.4, which is to show that message passing is inherently less prone to catastrophic forgetting issues; this would be highly intriguing and will undoubtedly require more clear justification.
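For intuition about the reinforcement / posterior-as-prior idea these reviews refer to, here is a deliberately simplified sketch. It is not the paper's algorithm: it treats a single binary perceptron rather than a deep network, replaces the layered BP/BPI/MF/AMP messages with a Gaussian (CLT) approximation of the pre-activation, and the damping and reinforcement schedules are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

def train_binary_perceptron_mp(X, y, epochs=50, damping=0.5, rho=0.9, eps=1e-8):
    """Caricature of reinforced ("focusing") message passing for binary weights.

    X: (P, N) array of inputs (e.g. +-1), y: (P,) array of +-1 labels.
    Keeps one local field h_i per weight; tanh(h_i) is the approximate marginal
    magnetisation and sign(h_i) the final binary weight. This is a single-layer,
    mean-field-style sketch, NOT the manuscript's multilayer BP equations.
    """
    P, N = X.shape
    h = np.zeros(N)
    for t in range(epochs):
        m = np.tanh(h)                    # current approximate marginals (parallel update)
        likelihood_field = np.zeros(N)
        for xmu, ymu in zip(X, y):
            M = m @ xmu                                   # mean of the pre-activation
            V = ((1.0 - m**2) * xmu**2).sum() + eps       # its variance under the CLT approximation
            u = ymu * M / np.sqrt(V)
            g = norm.pdf(u) / max(norm.cdf(u), eps)       # d log Phi(u) / du
            likelihood_field += g * ymu * xmu / np.sqrt(V)
        # Reinforcement: feed a fraction of the current posterior field back as a
        # prior for the next sweep (posterior-as-prior), then damp for stability.
        gamma = 1.0 - rho ** (t + 1)
        h = damping * h + (1.0 - damping) * (likelihood_field + gamma * h)
    return np.sign(h), np.tanh(h)         # binary weights and approximate marginals
```

The point of the sketch is only the structure of the loop: data-driven fields, a reinforcement field fed back from the current marginals, and damping, with sign(h) read out as the binary weights and tanh(h) as the approximate marginals mentioned in the first review.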
possible to expand this method to multiclass scenarios such as extending messagepassing decoding algorithms from binary linear codes to nonbinary codes 2 furthermore substantial message passing successes have occurred in the past particularly for sparse factor graphs as a result future applications of graph neural networks or sparse transformers will be quite fascinating typographical comments 1 to ensure perfect anonymity erase the name displayed in acknowledgement 2 on page 3 the term pasp rule is not defined until it is used in section 2 using messagepassing to train nns is intriguing but like the various flavors of sgd it does not appear to make a significant difference docsepthis manuscript provides an interesting try on alterative training algorithms for deep neural networks based on approximate messagepassing algorithms based on the wellknown belief propagation bp algorithm in particular the binary neural network is considered and four algorithms bp three variants of bp ie bpi mf amp are proposed within a unified posteriorasprior update framework experiments are conducted on standard supervised classification tasks and continual learning settings which shows comparable performances as standard sgd based methods after rebuttal i have read the authors feedback many thanks for the detailed pointtopoint feedback and other reviewers comments and modified the score accordingly overall the proposed scheme is interesting though strictly speaking the results are not very advantageous at least from its current results compared to traditional ones and some of the comparisons seem not very reasonablefair strengths of the paper a new paradigm of deep neural network training is prosed based on wellknown belief propagation which is a very interesting and positive try i do think that such kind of exploration itself is meaningful for future research the resultant algorithms achieve comparable performances as the sgd based methods in particular the minibatch implementation running time is also about the same order as sgd although apparently it is still slower than the standard sgd algorithms weaknesses of the paper the first weakness is that there seem no apparent advantages of the performances of the bpbased methods even for most complicated bpbpi both in terms of the generalization error and running time or implementation simplicity given this fact then one might doubt the practical usage of bpbased methods as a result it would be really helpful if the authors could find some scenarios that bpbased methods that are favorable indeed it is noticed that the authors had already made an effort in this direction ie the evaluation of local bayes error and local energy and the application in continual learning similar to the bayesian neural networks some comments are as follows 1 using more standard benchmark metrics to quantify uncertainty under distributional shift although local bayes error and local energy are interesting merits other standard benchmark metrics to quantify uncertainty under distributional shift such as expected calibration error ece and out of distribution ood entropy might be of more interest 1 it is thus suggested to add experiments on such standard benchmark metrics 1 and compare with other bayesian deep learning methods 2 adding comparisons with previous methods for continual learning cl although the bpbased method is shown to be welladapted to continual learning this might not be viewed as a special advantage since there are a variety of simple methods for the sgdbased method to 
2. Add comparisons with previous methods for continual learning (CL): although the BP-based method is shown to be well adapted to continual learning, this might not be viewed as a special advantage, since there are a variety of simple methods for the SGD-based method to enable continual learning [2, 3]. All kinds of Bayesian methods (including BP-based ones), such as variational Bayes, the Laplace method, et al., are naturally well adapted to continual learning within the variational continual learning (VCL) framework [4] and others. In particular, for the binary neural networks considered in this manuscript, there have also been some studies on SGD-based Bayesian training algorithms well adapted to continual learning, e.g., [5]. The authors only compare BPI with standard BinaryNet, and it is suggested to add some comparison with these previous continual learning algorithms to see if there is any improvement from BP-based methods.

The second weakness is that only the binary neural network is considered, and there is a lack of evaluation of continuous weights for BP-based methods. From my own understanding, the proposed BP-based methods are readily applicable to continuous weights (e.g., AMP was firstly designed to deal with continuous weights), so why are continuous weights not considered here? It would be helpful if several results for continuous weights could be added, with a comparison with SGD-based methods; otherwise, please add some discussion of the differences between continuous weights and binary weights.

The third weakness is that there is a lack of comparisons with several closely related works, in particular the expectation backpropagation (EBP) algorithm [6]. 1. Several closely related works on previous attempts at training deep neural networks using BP-like algorithms are missing. In particular, [6] proposed the expectation backpropagation (EBP) algorithm, which is applicable to training deep neural networks with not only binary weights but also continuous weights. EBP also has a forward pass and a backward pass, as in the paradigm of the current manuscript. Moreover, EBP has been shown to have similar efficiency and update form as the SGD-based method while achieving competitive performance; although it is based on EP rather than BP, in this manuscript the two are very closely related [7]. In particular, for the AMP variant considered in the current manuscript, AMP has been proved to be an approximation of EP in [8, 9]; does it then imply that the proposed AMP training version is similar to EBP in [6]? Given such similarity, it is highly suggested to discuss the differences and intrinsic relationships between EBP and the proposed BP-based method, as well as to add comparisons in the experiments. 2. Another related method is the proximal mean-field (PMF) method in [10], which is also a training algorithm for deep binary neural networks. PMF is proved to be equivalent to a proximal version of the mean-field (MF) method; as a result, what is the difference between the MF version in the current manuscript and the PMF method in [10]? 3. Moreover, in the Bayesian deep learning community there are various Bayesian training algorithms using mean-field variational inference, e.g., [11, 12, 13], but they are based on SGD methods. What, then, is the key difference between the MF algorithm in the current manuscript and those MFVI methods? It seems that they (including AMP, BPI, BP) optimize the same variational objective (the ELBO bound), and the only difference is the specific optimization method: previous ones (e.g., [11, 12], [13]) used SGD to optimize the ELBO bound, while the MF in the current manuscript uses message-passing updates. If so, then maybe the results are expected to be similar or the same, and possibly the BP-based method is not fundamentally different from the SGD method, as stated.

Another interesting point is that, from the current experimental results, BPI and AMP are basically the same as MF, while apparently BPI and AMP use more accurate tree-structured or Bethe approximations; can the authors provide some insights into the negligible differences between them?

Additional technical comments: 1. It seems that the so-called posterior-as-prior updates (PasP) are simply the core of the Bayes theorem in the sequential-update setting; here different minibatches correspond to different observations (though possibly with an overlap), which I think is also used similarly in [7]. 2. In the experiment parts (e.g., figure 1 and table 1), why are results for BP missing? 3. After training, how is prediction performed using the results of the BP-based methods? Are predictions obtained by sampling from the posterior distribution first and then computing the average output of the different samples (see the sketch after these comments)? Please illustrate this clearly in the main text; also, can the authors plot the posterior distribution of the learned weights? 4. Similarly, how is the continual learning performed? Is it done by using the learned posterior of the previous task as the prior over the next task, similarly to [4, 5]? 5. Why is the training error of the BP-based methods much lower than SGD while the test error is high or about the same? Is there any intuitive explanation? 6. It is unclear what the so-called additional reinforcement message is and how the resultant BP-based algorithms differ from the original versions (e.g., MF, AMP); it would be better to give some explanation in the main text if this point is important. 7. The submitted code of this manuscript seems unavailable. 8. It is suggested to add a discussion of the limitations of the BP-based methods: from the current results it seems that, although they are comparable (or slightly worse, with slightly higher complexity) to SGD methods, the additional advantages (Bayes, continual learning) are not apparent either, given that a variety of Bayesian deep learning methods can do the same in a presumably simpler way.
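For concreteness, here is a minimal sketch of the prediction procedure asked about in comment 3, under the assumption that training yields independent Bernoulli marginals over the binary weights and that predictions are averaged over sampled weight configurations; this is an illustration of that reading, not necessarily the authors' exact procedure.

```python
import numpy as np

def sample_binary_weights(p_plus, rng):
    """Draw a {-1, +1} weight configuration from independent per-weight marginals."""
    return np.where(rng.random(p_plus.shape) < p_plus, 1.0, -1.0)

def forward(x, weights):
    """Toy binary MLP with sign activations; the last layer is left linear for scoring."""
    h = x
    for W in weights[:-1]:
        h = np.sign(h @ W)
    return h @ weights[-1]

def bayes_predict(x, marginals, n_samples=20, seed=0):
    """Average class scores over weight configurations sampled from the marginals."""
    rng = np.random.default_rng(seed)
    scores = [forward(x, [sample_binary_weights(p, rng) for p in marginals])
              for _ in range(n_samples)]
    return np.mean(scores, axis=0).argmax(axis=-1)

# Tiny usage example with made-up marginals (hypothetical 4 -> 8 -> 3 network).
rng = np.random.default_rng(1)
marginals = [rng.random((4, 8)), rng.random((8, 3))]
x = rng.normal(size=(5, 4))
print(bayes_predict(x, marginals))
```

Averaging over a growing number of sampled configurations is what would turn the per-weight marginals into an approximate Bayesian prediction, which is presumably where the reported gain over pointwise solutions comes from.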
References:
[1] Ovadia, Yaniv, et al. Can you trust your model's uncertainty? Evaluating predictive uncertainty under dataset shift. NeurIPS 2019.
[2] Zenke, Friedemann, Ben Poole, and Surya Ganguli. Continual learning through synaptic intelligence. ICML 2017.
[3] Parisi, German I., et al. Continual lifelong learning with neural networks: a review. Neural Networks 113 (2019): 54-71.
[4] Nguyen, C. V., Li, Y., Bui, T. D., et al. Variational continual learning. ICLR 2018.
[5] Meng, X., Bachmann, R., Khan, M. E. Training binary neural networks using the Bayesian learning rule. ICML 2020.
[6] Soudry, Daniel, Itay Hubara, and Ron Meir. Expectation backpropagation: parameter-free training of multilayer neural networks with continuous or discrete weights. NeurIPS 2014.
[7] Minka, Tom. Divergence measures and message passing. Technical report, Microsoft Research, 2005.
[8] Meng, X., Wu, S., Kuang, L., et al. An expectation propagation perspective on approximate message passing. IEEE Signal Processing Letters 22.8 (2015): 1194-1197.
[9] Cakmak, B., Winther, O., and Fleury, B. H. S-AMP: approximate message passing for general matrix ensembles. IEEE Information Theory Workshop (ITW), Nov. 2014, pp. 192-196.
[10] Ajanthan, T., Dokania, P. K., Hartley, R., et al. Proximal mean-field for neural network quantization. Proceedings of the IEEE/CVF International Conference on Computer Vision, 2019, 4871-4880.
[11] Graves, Alex. Practical variational inference for neural networks. NIPS 2011.
[12] Blundell, Charles, Julien Cornebise, Koray Kavukcuoglu, and Daan Wierstra. Weight uncertainty in neural networks. ICML 2015.
[13] Osawa, Kazuki, et al. Practical deep learning with Bayesian principles. NeurIPS 2019.

This paper applies the well-known belief propagation (BP) algorithm and several variants to train deep neural networks with binary weights. Overall, I like the topic of this manuscript, and it is a good start toward exploring the potential of BP-based methods for deep learning. While this is indeed a very interesting try, there are several aspects to be improved in the current manuscript, such as a systematic evaluation of the potential advantages of the proposed BP-based methods, a comparison with previous closely related algorithms (especially the EBP algorithm and other Bayesian deep learning methods), and clarification of some related technical points, as detailed above. In addition, given the comparable or slightly worse performance with higher complexity (but not competitive results) of the proposed BP-based algorithms, it is worth discussing the intuitive reason behind this. This is not to say that it is not useful to study an algorithm without SOTA performance; on the contrary, it is very useful to explain the intuitive underlying reason why it works this way even if it is not SOTA, as well as its limitations.

docsep

This paper introduces a belief-propagation message-passing training algorithm for multilayer neural networks. The algorithm is adapted to minibatch training and biases distributions toward high-entropy solutions. Empirical results show that neural networks with discrete weights and activations trained with this algorithm achieve performance comparable to the same networks trained with SGD (BinaryNet), and can make approximate Bayesian predictions that have higher accuracy than pointwise solutions.

Correctness: no correctness concerns.

Technical novelty and significance: one of the ideas of the paper, to use belief propagation to estimate the marginals of the weights of a neural network (NN) in order to make probabilistic predictions, is a novel paradigm, allowing confidence to be attached to NN predictions. This is a research direction with a lot of potential, which can lead to better NN performance in many tasks. However, the paper's theoretical contributions are somewhat limited, as they seem to be very similar to the contributions of Baldassi et al. (2016). Although the authors claim that the novelty of their approach is that their results apply to minibatch training and deep neural networks, it is unclear how the results derived for deep neural networks differ from those derived for shallow ones in Baldassi et al. (2016); can the authors provide a more detailed explanation?

Empirical novelty and significance: the second numerical experiment, on continual learning, provides empirical justification for using the proposed algorithm to avoid catastrophic forgetting. The downside of the numerical results is that, again, the neural network architectures that are considered are not very deep (they are wide, but this wasn't as emphasized in the text). In the first numerical experiment, the proposed method does not have any advantages with respect to SGD other than a lower training error. The authors mention that this could suggest that their algorithm leads to better NN capacity, but this is not explored any further in the text. I suggest running simulations with deeper neural networks and expanding on the advantages and disadvantages of the proposed BP scheme.

Other concerns: what is the quality of the approximation in eq. 3, i.e., how much is lost by making this approximation? It would be helpful to have a formal definition for the variables in eqs. 6 and 7. In the caption of fig. 2, the authors state that the training hyperparameters in the two cases are independently selected and generally differ; how are they selected, by cross-validation?

Although the paper is a close extension of Baldassi et al. (2016), the idea of using belief propagation to estimate the marginals of the weights of a neural network (NN) in order to make probabilistic predictions is a novel paradigm, and the empirical results on continual learning seem promising. I recommend acceptance, subject to the clarification of the concerns discussed in the main review.

### Summary:
This paper presents a method for training neural networks with belief propagation-based algorithms. The approach is to set a fully factorized prior over weights, compute a forward and backward pass of messages on a minibatch, and then set the new prior to be a slightly higher-temperature version of the minibatch approximate posterior. This new prior is then used for the next minibatch, and training iterates. There is a huge range of opinions amongst reviewers. The main thing that reviewers appreciate is the novelty of using belief propagation instead of backpropagation for training neural networks: finding alternatives to backprop with favorable properties could be hugely impactful, so even small gains in this direction are valuable. The posterior-as-prior update is interesting, and the authors have clearly put in care to getting things working. The main weaknesses are that some of the experiments aren't always reasonable and fair, the paper is framed to overstate its contribution, and there is not a clear advantage over standard approaches (e.g., MNIST error rates for a two-hidden-layer network are 2%). In the end this is a very borderline paper, but I find rev. nngl's position to be most informative. In particular, the paper frames the main contribution as message passing as an alternative to SGD for training neural networks, but this is too broad a framing given the existence of other closely related approaches, like Soudry et al., pointed out by rev. nngl. I'd recommend that the authors frame their work as an advance over other message-passing-based approaches to training neural networks, and focus on piecing apart precisely why the proposed approach improves over EBP and alternatives.
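To make the training scheme summarized above concrete, here is a rough sketch of the posterior-as-prior loop for a single layer of +/-1 weights; the minibatch posterior update is replaced by a crude Hebbian, mean-field-style stand-in, so this illustrates the update pattern (tempered posterior re-injected as the next prior) rather than the authors' actual BP forward/backward pass.

```python
import numpy as np

def minibatch_posterior(prior_logits, X, y, beta=0.5):
    """Crude stand-in for the BP forward/backward pass on one minibatch.

    For a single-layer +/-1 perceptron, the Hebbian term y_i * x_i pushes each
    weight's log-odds toward configurations that fit the minibatch; the paper's
    algorithm would instead propagate messages through all layers.
    """
    return prior_logits + beta * (y[:, None] * X).sum(axis=0)

def pasp_train(X, y, batch_size=32, rho=0.95, n_epochs=3, seed=0):
    """Posterior-as-prior loop: the tempered minibatch posterior becomes the
    prior for the next minibatch (rho < 1 plays the role of the slightly
    higher temperature mentioned in the summary)."""
    rng = np.random.default_rng(seed)
    logits = np.zeros(X.shape[1])             # factorized prior, p(w = +1) = 0.5
    for _ in range(n_epochs):
        order = rng.permutation(len(X))
        for start in range(0, len(X), batch_size):
            idx = order[start:start + batch_size]
            posterior = minibatch_posterior(logits, X[idx], y[idx])
            logits = rho * posterior          # re-inject as the next prior
    return np.sign(logits)                    # pointwise weights; marginals = sigmoid(logits)

# Toy usage: recover a planted +/-1 teacher perceptron.
rng = np.random.default_rng(1)
w_true = rng.choice([-1.0, 1.0], size=50)
X = rng.normal(size=(2000, 50))
y = np.sign(X @ w_true)
print("agreement with teacher:", (pasp_train(X, y) == w_true).mean())
```

The point of the sketch is only the control flow: each minibatch's approximate posterior, slightly flattened, becomes the prior for the next minibatch, which is what distinguishes this scheme from a single global Bayesian update.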
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
Pros: the paper studies a very important problem in gene data analysis; the proposed method is technically sound; the method is intuitive in its idea and easy to implement; the results are interpretable, and according to the experimental evaluations the proposed method is consistent with existing biological observations and could further identify unknown genetic targets, so it potentially has insightful scientific implications.

Cons: relevance and generalizability are unclear. The paper is relevant to researchers in subareas only, and it is best suited to a bioinformatics or neurobiology community; the paper requires background in neurobiological data analysis to evaluate whether the proposed method brings meaningful scientific insights. In terms of novelty in the machine learning field, the improvement over existing works, algorithmically or computationally, seems relatively limited. It is also unclear whether the proposed method generalizes well outside the subdomain of gene expression; it seems FLDA could possibly be applied to any such tensor data, but a discussion of its general applicability to other data would be nice. Relation to prior work is insufficient: related work has not been discussed adequately, thus making the significance of this paper unclear. The authors discuss CCA and autoencoders in the introduction and also mention the differences between FLDA and LDA / 2-LDAs, which are compared with the proposed method in experiments; I find such discussions inadequate for the readers to understand the baseline or state of the art in this field, so it is somewhat unclear how the work improves on existing works. Experimental evaluation is not comprehensive: as a largely application-oriented work, the comprehensiveness of (and tricks in) the experiments should be explained more clearly. For example, one part of the experiment, the sparsity-based regularization of FLDA, seems like the application of RIFLE (Kean Ming Tan et al.) to the data; readers are unclear what the unique challenges are in the current setting. Also missing are computational details (convergence, scalability, etc.) and any guidance on the hyperparameter selection; and does this method also generate meaningful results on other gene datasets?

docsep

This manuscript describes a generalization of ANOVA that is intended to be used in the interpretation of single-cell RNA-seq data. The method requires specification of an orthogonal, discrete categorization of cells, nominally by phenotype. The method then linearly factorizes the observed gene expression values into features and their interactions relative to the phenotypic categories. The factors can then be used to help interpret the categories, especially in conjunction with a regularizer to reduce the number of genes involved in the factors.

I found this paper frustrating to read. The second paragraph does not make clear exactly what problem is being addressed: it says that we are given phenotypic descriptions of neuronal types, but not where those descriptions come from, so I skipped ahead to the methods section to try to figure it out; but even after reading that section I could not understand where the phenotype values come from. It was only when I made it to section 2 that I verified that, indeed, the phenotypes are not observed but only inferred. Having understood the problem, it seems to me that the use case for this approach is quite specific: we need to have RNA-seq data that can be clustered in such a way that we can assign clusters to predefined phenotypic categories. The claim in the discussion section that "our approach can be easily generalized to additional characteristics such as electrophysiology and connectivity" was not clear to me, but I assume this still refers to phenotypes that are inferred from the scRNA-seq data. The critique leveled against CCA is that this approach cannot factorize gene expression according to individual features, making the result hard to interpret; but this seems like it must also be true of the proposed method, since prior to any analysis the data is transformed via PCA (section 5.2). I am confused, therefore, about how the method can be used to select genes in section 6. The comparison to LDA is done using two metrics based on signal-to-noise ratio and mutual information; I would have liked to hear more about why these particular metrics are appropriate and, in particular, how they relate to whatever use case the authors have in mind. Overall, I am still not convinced that this is a problem that needs to be solved.

docsep

Summary: the paper provides an ANOVA-inspired method called FLDA for creating a low-dimensional embedding of scRNA-seq data; additionally, they propose a sparsity-based method to find gene signatures which can be used for further biological validation. The authors extensively evaluated their method on a data set of real expression values in Drosophila neurons; they compared FLDA to two simpler and similar approaches, namely linear discriminant analysis (LDA) and a more feature-aligned version (2-LDA). On all benchmarks and metrics FLDA shows a clear advantage over the other methods.

Reasons for score: overall I vote for accepting. I can imagine several use cases of this approach, which looks computationally very feasible and easy to apply; additionally, it is theoretically well founded on existing statistical methods. However, my main concern is FLDA's dependency on correct feature annotation, which is not mentioned by the authors. I hope the authors can address my concern and the other cons in the rebuttal period.

Pros: the authors clearly justify their design principle of FLDA based on ANOVA; benchmarks using different scores show a clear advantage of the FLDA method; the gene signature found for the T4/T5 data set shows known marker genes for neuronal development; the extension to the 2-feature case and other possible extensions are clearly outlined; extensive appendix showing theoretical validation for most obstacles in scRNA-seq data.

Cons: no evaluation of how dependent the model is on correct feature annotation; no comparison to other methods for estimating gene signatures; validation on just one biological/real data set.

Questions during rebuttal period: please address and clarify the cons above, mainly: what happens if one level is incorrectly defined, e.g., the ground truth is two or more levels for this singular level? In other words, how robust is the method to the annotation?

docsep

Summary: this manuscript presents a novel dimensionality reduction method called factorized linear discriminant analysis. The method starts from a real problem in neurobiology and tries to link expression levels of neural genes to phenotypes. In particular, the main goal of the proposed technique is to find linear projections of the gene expression which vary maximally with one phenotypical aspect and minimally with the others. The approach is evaluated using a synthetic example and a real case study involving Drosophila T4/T5 cells.
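To illustrate the kind of objective described here, projections that vary maximally with one phenotypic factor while varying minimally with the other, the sketch below solves an LDA-style generalized eigenvalue problem on synthetic data. This is my own illustrative surrogate for the idea, not the authors' exact FLDA formulation, and the two crossed factors and "gene" dimensions are made up for the example.

```python
import numpy as np
from scipy.linalg import eigh

def between_scatter(X, labels):
    """Between-group scatter matrix for one categorical factor."""
    mu = X.mean(axis=0)
    S = np.zeros((X.shape[1], X.shape[1]))
    for c in np.unique(labels):
        Xc = X[labels == c]
        d = (Xc.mean(axis=0) - mu)[:, None]
        S += len(Xc) * (d @ d.T)
    return S

def factor_aligned_projection(X, factor_a, factor_b, n_components=2, reg=1e-3):
    """Directions maximizing variation across factor_a while suppressing variation
    across factor_b: maximize w' S_a w / w' (S_b + reg*I) w, a generalized
    eigenvalue problem in the spirit of (but simpler than) the factorized LDA idea."""
    Sa = between_scatter(X, factor_a)
    Sb = between_scatter(X, factor_b) + reg * np.trace(Sa) * np.eye(X.shape[1])
    _, evecs = eigh(Sa, Sb)                   # eigenvalues in ascending order
    return evecs[:, ::-1][:, :n_components]   # keep the top components

# Toy usage: two crossed phenotypic factors, each driving different "genes".
rng = np.random.default_rng(0)
a = rng.integers(0, 2, 300)                   # e.g. dendrite location
b = rng.integers(0, 4, 300)                   # e.g. axonal lamina
X = rng.normal(size=(300, 20))
X[:, 0] += 3.0 * a                            # genes 0 and 1 follow factor a
X[:, 1] -= 2.0 * a
X[:, 2] += 2.0 * b                            # genes 2 and 3 follow factor b
X[:, 3] += 1.5 * b
W = factor_aligned_projection(X, a, b)
print("loadings of the first component on factor-a genes:", np.round(np.abs(W[:2, 0]), 2))
```

The sparsity-based gene-signature selection the reviewers mention would then presumably amount to additionally penalizing the number of nonzero gene loadings in W.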
Positive points: the paper starts from a real and challenging bioinformatics problem, i.e., the analysis of single-cell RNA sequencing data for neurobiology; data analysis tools for these data are nowadays fundamental to unravel the high complexity of this biomedical field. The idea is interesting, well motivated, and well explained. The proposed approach is simple and understandable; interpretability of solutions and results is currently fundamental when dealing with biomedical data. The method is tested using a real-world application.

Negative points / questions:

Comment 1: the main problem of this manuscript is that the proposed method is not well inserted into the state of the art. Actually, the authors' discussion of related works is very limited, with only one paragraph at the bottom of page 1 plus the description of linear discriminant analysis in section 4. Many different linear dimensionality reduction techniques have been proposed in the past, each one with different characteristics, goals, and optimization techniques; the authors should discuss them, especially in relation to their approach. A good entry point is the survey of Cunningham and Ghahramani (John P. Cunningham, Zoubin Ghahramani. Linear dimensionality reduction: survey, insights, and generalizations. Journal of Machine Learning Research 16 (2015): 2859-2900). Also, for what strictly concerns linear discriminant analysis, I think that a deeper discussion is needed: many different extensions of LDA have been proposed, some of them strictly related to the goals/methods of this paper, like 2D-LDA (Ming Li, Baozong Yuan. 2D-LDA: a statistical linear discriminant analysis for image matrix. Pattern Recognition Letters, volume 26, issue 5, 2005, pages 527-532). This contextualization with respect to the state of the art is fundamental; without it, it is very difficult to get the true contribution of the proposed approach. In the same spirit, I suggest the authors include some more recent techniques in the experimental comparison.

Comment 2: the sparsification approach has not been fully discussed and justified. Many different methods for sparsification have been proposed: why do the authors choose this particular one? How does this method relate to alternatives? Some sparse algorithms have also been introduced for discriminant analysis, not strictly related to the medical field, such as N. H. Ly, Q. Du, and J. E. Fowler. Sparse graph-based discriminant analysis for hyperspectral imagery. IEEE Transactions on Geoscience and Remote Sensing, vol. 52, no. 7, pp. 3872-3884, July 2014.

Comment 3: if I understand correctly, in the experiments the authors apply a PCA to reduce the gene expressions before applying the proposed method (last two lines of page 5). Is this a reasonable choice? I know that this is commonly done in other scenarios, like in Fisherfaces for face recognition, but in these experiments genes are fundamental for the knowledge extraction. I guess this is not done for the sparsified version, otherwise genes would not have been extracted; can you comment on this?

Comment 4: some authors argued that formulating the optimization of linear dimensionality reduction techniques as eigenvalue or generalized eigenvalue problems is not always an adequate choice (John P. Cunningham, Zoubin Ghahramani. Linear dimensionality reduction: survey, insights, and generalizations. Journal of Machine Learning Research 16 (2015): 2859-2900); in that paper the authors also suggested an alternative. Can you provide a comment on this?

Comment 5: I found some difficulties in reading and understanding the presentation and the discussion of the results. One problem was definitely the fact that tables are put in the appendix and that such tables contain many numbers; I suggest the authors put a summarizing table inside the manuscript, if possible.

Comment 6: the phenotypical features are assumed to be categorical. How strict is this assumption? How much information are we losing, in this context and in other contexts? In other words, are there other scenarios/applications in which this method can be applied? Adding some other possible application scenarios would increase the value of the proposal.

Comment 7: this comment is related to the previous one but is more an open suggestion than a criticism. Did you consider using non-categorical features? I think that this would open up the usage of approaches used in three-way data analysis. Even though I am not aware of dimensionality reduction techniques for three-way data, I think a relation exists: techniques to extract interesting relations between different directions of data have been proposed, especially in the context of triclustering (see for example Henriques R., Madeira S. C. Triclustering algorithms for three-dimensional data analysis: a comprehensive survey. ACM Comput. Surv. 51(5), 95, 2019). I suggest the authors take a look at this field as well.

### Summary:
the paper introduces a linear projection method inspired by anova for finding a supervised lowdimensional embedding a positive aspect is that the method is straightforward and it is even slightly surprising that in the family of linear models there still was an uncovered niche the paper was considered useful for the purpose studied in the paper singlecell rnaseq data analysis but to claim broader usefulness more evidence should be presented one particular detail which was brought up by all reviewers was the pca preprocessing for ica such preprocessing is a sensible choice as linear ica is essentially just a rotation of the pca components but the justification is not as good for a supervised method pca may be necessary in practice but may lose important categoryrelevant information the paper still needs a significant revision before publication even though the method is straightforward a lot of time and discussion was required for expert reviewers to understand it
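as a purely illustrative aside not taken from the reviewed paper the concern about pca preprocessing before a supervised projection can be made concrete with a small synthetic sketch in which the category label lives along a lowvariance direction so keeping only the top principal component discards it the twofeature setup and all variable names below are hypothetical

```python
# hedged illustration: pca applied before a supervised projection can drop a
# low-variance but class-separating direction (synthetic toy data, not the paper's setup)
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n = 500
labels = rng.integers(0, 2, size=n)                       # two phenotype categories
high_var_noise = rng.normal(scale=5.0, size=n)            # dominant but uninformative axis
low_var_signal = labels + rng.normal(scale=0.3, size=n)   # weak axis that separates the classes
X = np.column_stack([high_var_noise, low_var_signal])

X_pca = PCA(n_components=1).fit_transform(X)              # keep only the top component

lda_raw = LinearDiscriminantAnalysis().fit(X, labels)
lda_pca = LinearDiscriminantAnalysis().fit(X_pca, labels)
print("accuracy on raw features      :", lda_raw.score(X, labels))      # near 1.0
print("accuracy after 1-component pca:", lda_pca.score(X_pca, labels))  # near chance
```

in this toy setting the supervised projection separates the categories from the raw features but not from the single retained principal component which is the kind of categoryrelevant information loss the reviewers and the metareview point to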
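continuing the illustrative aside two further points raised above can be sketched in the same hedged spirit the classical lda objective can be solved as the generalized eigenvalue problem s_b w = lambda s_w w and when a pca step precedes the supervised projection the learned directions can in principle be mapped back to gene space through the pca loadings the function and the commented usage below are hypothetical and are not the flda implementation from the paper

```python
# hedged sketch: lda as a generalized eigenvalue problem, plus mapping a direction
# learned on pca scores back to gene-level loadings (illustrative only)
import numpy as np
from scipy.linalg import eigh
from sklearn.decomposition import PCA

def lda_directions(X, y, n_dirs=1):
    """solve S_b w = lambda S_w w and return the leading discriminant directions"""
    mu = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)                  # within-class scatter
        diff = (mc - mu)[:, None]
        Sb += Xc.shape[0] * (diff @ diff.T)            # between-class scatter
    vals, vecs = eigh(Sb, Sw + 1e-6 * np.eye(d))       # small ridge keeps Sw invertible
    return vecs[:, ::-1][:, :n_dirs]                   # directions with the largest eigenvalues

# hypothetical usage: genes -> pca scores -> discriminant direction -> back to genes
# expr is a (cells x genes) matrix and pheno holds the categorical label of each cell
# pca = PCA(n_components=50).fit(expr)
# w_pca = lda_directions(pca.transform(expr), pheno)
# gene_loadings = pca.components_.T @ w_pca            # gene-level weights for interpretation
```

the last commented line is the part relevant to comment 3 if a pca step is used the projection can still be expressed in gene space but only through the pca loadings which is weaker than selecting genes directly as the sparsified version aims to do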
Below is given a review of a research paper from a conference or journal. Please write a summary of the review. ### Review: this paper studies a stochastic multiarmed bandit setting with delayed feedback where additionally the bandit policy must incentivize an external agent to pull the desired arm and the arm preferences of this external agent are selfreinforcing they design a policy ucbfdf for this setting and prove expected regret and incentive payment upper bounds under various delay distribution assumptions finally they evaluate their policy on bandit environments constructed using amazon review data while i find the problem setting to be intriguing i have a number of concerns about the writing proof techniques and regret upper bounds i will outline these concerns below discussion of related work it seems to me that there are a number of typos in the related work discussing bandits with delayed feedback in particular i was confused by the claim that joulani et al 2013 proves a regret bound of o(sqrt(kt log t + k e[tau])) where tau is the delay indeed this upper bound is false when e[tau] is of order t since the regret is necessarily linear in this regime however theorem 6 from that paper does indeed show a proper scaling in this regime i would encourage the authors to clarify this subtlety somehow i was also confused by the regret lower bound citation in delayed feedback from vernade et al 2017 indeed a scaling of omega(k log t) follows from the standard bandit setting perhaps you should clarify this also i was confused by the line where only an upper bound on the tail of the delay distribution is needed without requiring the expectation to be finite perhaps you should clarify that only a polynomial upper bound on the tail of the delay distribution is needed here regret modeling and payments it is not clear to me how the payments made by the bandit policy affect the performance of the algorithm in your model indeed the regret is measured against a policy which makes an infinite payment to the best arm does this mean that the algorithm is also allowed to make unbounded payments at no cost to performance if instead we measure regret with respect to a policy which can only make some bounded number of payments how does the regret change perhaps there is some meaningful notion of payment regret that measures the excess number of payments over the genie policy algorithm naming minor comment although your algorithm is called ucb it seems to not actually be a ucbstyle policy as the exploration and exploitation phases are distinct unless i am misreading something it seems this algorithm is more of an eliminationstyle algorithm perhaps you could consider changing the name to reflect this technical concerns i am confused by the claimed regret scaling in lemma 1 and theorem 3 indeed both bounds seem to have a scaling term on the order of delta · e[d_t] however consider an environment where delta = 1/sqrt(t) eg the standard minimax regret lower bound environment and suppose that all delays are deterministically t so that d_t = t almost surely then it seems that your lemma and theorem would give a regret upper bound of o(sqrt(t)) in this case however it is clear that regret must be linear since the policy never receives any rewards am i missing something i do not see how the claimed scaling can be true in trying to understand where this scaling term comes from i began reading the proof of lemma 1 it seems that there is an issue in the inequality of 7 on page 13 indeed this inequality does not typematch since the lefthand side is a deterministic
quantity just the probability of some event but the righthand side is random as it depends on t_{a,t}, d_{a,t} and c_{a,t} note that there is a similar issue in equation 10 it seems that this analysis introduces the scaling term discussed in the previous paragraph that does not make sense to me thus i suspect that fixing this bug will change the reported regret scaling empirical results i suggest that the authors consider adding at least some simple baseline to their experimental results for example you may consider adding a ucb baseline to demonstrate that policies that ignore the incentive structure should achieve poor regret scaling additionally it would be nice to include error bars in the plots postauthor response please refer to my response to the authors i think that most of my main concerns have now been addressed although the problem setting is interesting there are a number of concerns i have with the writing and the technical results and proofs of the paper see above for details since i do not think these concerns can be sufficiently addressed during the rebuttal i do not think that this paper is ready for publication postauthor response please refer to my response to the authors i have changed my opinion of the technical concerns and the remaining ones seem minor the authors have given a satisfactory response and i have increased my score accordingly i would still like to see the concerns mentioned in my responses addressed before publication however docsepthis paper combines three aspects of mab delayed reward incentivized exploration and selfreinforcing user preference they motivate this problem from the perspective of online recommender systems for this model they propose a new ucb based algorithm that achieves the optimal upper bounds they also set up an online experiment based on amazon review data and show how the regret evolves for both armindependent delays and armdependent delays strengths of the paper are as follows the paper considers a practical scenario that recommender systems face prior works consider only one of the aspects delayed rewards or incentivized exploration this paper combines them to study the joint effects the paper is wellwritten with a meaningful experimental section where they empirically compare the incentive costs with regret the algorithm is realistic in terms of practical implementation the main weakness of this paper is as follows my main complaint about the paper is primarily around positioning in particular it is not entirely clear to me what the main challenge for the new model is that does not exist in either the delayed reward mab or the incentivized exploration lines of work a discussion around that and why adapting the algorithm for incentivized exploration to handle delayed reward would not work in particular delayed rewards for a ucb type algorithm should not pose a whole lot of difficulties so i am wondering why a new algorithm design is needed comparing and contrasting the current algorithm to prior works will greatly help the reader as above i am supportive of this paper and line of work i would like to better understand the challenges which will help me appreciate the results better thus i would like the authors to address that docsepthis paper considers a mab framework with the joint effect of incentivized sampling delayed sampling feedback and selfreinforcing user preferences the framework considers delayed feedbacks to reflect a more practical setting where customer preferences among products are influenced and reinforced by historical feedbacks major comments 1 this paper is largely motivated by zhou et al 2021 which incorporated selfreinforcing user preferences into the incentivized bandit learning framework the key difference is that this work considers the delay effect that is the accumulated reward information accounts only for reward information that can be observed up to time t while zhou et al 2021 can observe the feedback immediately could you elaborate more on the technical difficulty when you consider the delay effect compared to the work zhou et al 2021 2 in bandits with delayed feedback pikeburke et al 2018 their theorem 2 illustrates that the regret bound has an additional term involving log(1/delta_j) and e[tau] which does not depend on t however in your theorem 3 it has an additional term sqrt(4 e[tau] + 1) ln t which has a dependence on t why is that can it be improved could you derive the lower bound and close the gap on the dependence of the delay period 3 in assumption 1 it assumes that the delays of arm a follow an independent delay sequence tau_{a,t} where each element is a random variable satisfying tau_{a,t} ~ tau_a can the result be generalized to the setting where tau_{a,t} follow different distributions when t varies 4 why does only the term g(b_1) show up in the regret bound but not g(b_t) for t > 1 5 regarding the numerical experiments how do you choose the selfreinforcing preference function f(x) and the incentive impact function g(b_t) in addition can you compare the regret between the setting with the delay effect and that without the delay effect and different delay distributions through which to illustrate the influence of the delayed feedback the problem this paper studied is defined in a clear way but the authors may need to emphasize the technical contribution that is built upon the existing work and demonstrate the tightness of the regret bound docsepthis paper proposes a multiarmed bandit mab framework with three realistic considerations incentivized sampling delayed feedback and selfreinforcing preferences the paper proposes a ucbfilteringwithdelayedfeedback ucbfdf policy for the new mab framework for general feedback delays with bounded expectations the authors showed that the delayed sampling feedback has an additive penalty on regret and incentive costs then utilized this key fact to derive that the ucbfdf policy achieves logarithmic regret and incentive cost in the new mab framework the theoretical bounds are verified by experiments on instances with 3 arms using amazon review data pros 1 the motivation for introducing the three factors is explained very clearly the description of algorithm 1 and the intuition behind each step in the policy are also wellwritten 2 the comparisons with related works given in section 2 are helpful 3 the theoretical results to my understanding are derived under fairly general assumptions the distinctions between armindependent versus armdependent delays are useful to see cons 1 the experiment evaluation is insufficient in my opinion in terms of verifying the provided regret bounds i find the presented experimental results somewhat narrow given that the derived bounds contain many problemspecific parameters such as the arm number k and the delay expectation e[d_t] i think additional experiments with different setups would strengthen the empirical results for example would it be possible to replace the normal delay distributions with other distributions what would the results look like under different selfreinforcing preference functions and incentive impact functions additionally i would expect the experiments to provide some empirical evidence for the benefits of considering these additional factors for example have the authors considered experiment comparisons between ucbfdf on the new mab framework versus a policy on the standard stochastic mab framework 2 i briefly checked zhou et al 2021 which already considered two out of the three new factors to my understanding the main contribution from this paper is the additional factor of delayed feedback in the mab framework while i agree with the argued usefulness of allowing feedback delays i am not sure how much novelty or what new ideas are needed in the design of the ucbfdf policy and the theoretical analysis please correct me if i am wrong it appears that algorithm 1 differs from policy 2 in zhou et al 2021 only in the exploitation phase to include the delay d_{a,t} into the dominance criterion is this the obvious choice or is there a more sophisticated argument underneath some minor suggestions 1 the mathematical formalization for the selfreinforcing preference is a little difficult to find in the paper 2 i believe theta is never formally defined in the paper this may be a standard expression but i think giving its definition would still be helpful 3 would it be possible to mark t_1 and t_2 to indicate the lengths of each phase when plotting the experiment results over the selected time horizon t questions during rebuttal period please address and clarify the cons above in particular could the authors highlight the theoretical novelty of the proposed framework policy and theoretical bound derivations in comparison to the cited works i vote for 6 marginally above the acceptance threshold the paper undertakes the ambitious goal of incorporating the joint effects of three new factors into a more realistic mab framework the proposed ucbfdf policy is shown to achieve desirable logarithmic regret without excessive incentive spending and the theoretical results can apply to quite general delay distributions although i think the paper is wellwritten and the results are useful i find the current version lacking in two main aspects first the experiments are limited in addition to verifying the theoretical bounds the experiments could also provide more insights about the ucbfdf policy andor compare with alternative mab frameworks to demonstrate the advantages of incorporating the new factors second the technical novelty and significance should be highlighted in a clearer way ### Summary:
this paper tackles a bandit problem that incorporates three challenges motivated by common issues encountered in online recommender systems delayed reward incentivized exploration and selfreinforcing user preference the authors propose an approach called ucbfilteringwithdelayedfeedback ucbfdf for this problem and provide a theoretical analysis showing that ucbfdf achieves the optimal regret bounds their analysis also implies that logarithmic regret and incentive cost growth rates are achievable under this setting these theoretical results are supported by empirical experiments eg using amazon review data the main concern with this paper is that the considered challenges have all been tackled already in different bandit settings so the novelty here is that they are being tackled jointly it would be more convincing if the experiments included baselines from these existing settings to motivate the need for a new strategy rather than simply relying on methods that have been proposed previously to address each of these problems independently the experiments currently contain only a baseline for bandits with selfreinforcing user preference which was added during the rebuttal phase
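as a purely illustrative aside and not the ucbfdf policy from the paper the delayed feedback filtering that the reviews keep returning to can be sketched with a generic ucb index that aggregates only the rewards whose delay has already elapsed the delay model and all names below are hypothetical

```python
# hedged sketch of a generic ucb index under delayed feedback: at time t the learner
# only sees rewards whose (hypothetical) delay has elapsed, so counts and means are
# computed from the observed subset only (illustrative, not the paper's ucbfdf policy)
import math
import random

def ucb_index(history, arm, t):
    """history holds (arm, pull_time, delay, reward) tuples generated so far"""
    observed = [r for (a, s, d, r) in history if a == arm and s + d <= t]
    if not observed:
        return float("inf")                       # unobserved arms get explored first
    mean = sum(observed) / len(observed)
    bonus = math.sqrt(2 * math.log(max(t, 2)) / len(observed))
    return mean + bonus

def run(means, delay_max, horizon, seed=0):
    random.seed(seed)
    history = []
    for t in range(1, horizon + 1):
        arm = max(range(len(means)), key=lambda a: ucb_index(history, a, t))
        reward = 1.0 if random.random() < means[arm] else 0.0
        delay = random.randint(0, delay_max)      # feedback becomes visible delay steps later
        history.append((arm, t, delay, reward))
    return history

# e.g. run([0.5, 0.6], delay_max=50, horizon=2000); increasing delay_max slows down how
# quickly the two arms separate, consistent with the additive delay penalty discussed above
```

this sketch ignores incentives and selfreinforcing preferences entirely which is exactly why the reviews ask for such a baseline and for a clearer statement of what the joint setting adds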
5933, 323, 15210, 400, 1025, 17947, 281, 6016, 13444, 10921, 651, 417, 789, 275, 1798, 13444, 23267, 323, 271, 44274, 67, 1511, 5933, 943, 417, 16753, 247, 2644, 2257, 273, 12748, 594, 891, 717, 12371, 2139, 247, 747, 5933, 2216, 310, 3058, 509, 66, 272, 285, 42455, 253, 1655, 5933, 281, 2720, 2987, 588, 10260, 1361, 253, 9414, 347, 1840, 891, 717, 23384, 273, 436, 2929, 285, 1386, 273, 789, 891, 651, 751, 281, 1805, 2096, 253, 7881, 534, 588, 1361, 479, 11435, 253, 1543, 1805, 3021, 891, 651, 751, 253, 4477, 281, 2953, 326, 5474, 33032, 2520, 2929, 19401, 247, 278, 357, 7792, 342, 6036, 1055, 273, 15210, 400, 1025, 10491, 13444, 10491, 8680, 285, 1881, 39910, 22958, 2608, 17971, 253, 7792, 19401, 13444, 8680, 84, 281, 4887, 247, 625, 8542, 4758, 835, 7731, 17971, 2190, 3580, 403, 12208, 285, 28809, 407, 9493, 8680, 84, 2201, 5701, 50276, 18, 186, 2520, 2929, 310, 8127, 17194, 407, 1182, 14451, 1162, 355, 43425, 534, 11217, 1881, 39910, 22958, 2608, 17971, 715, 253, 15210, 400, 1025, 3961, 262, 4715, 7792, 253, 2234, 3064, 310, 326, 436, 789, 19401, 253, 5778, 1055, 326, 310, 253, 7358, 12581, 5722, 1491, 8553, 323, 10921, 1491, 326, 476, 760, 320, 2540, 598, 281, 673, 246, 1223, 1182, 14451, 1162, 355, 43425, 476, 10018, 253, 8680, 4745, 812, 368, 21184, 625, 327, 253, 7681, 10183, 672, 368, 1908, 253, 5778, 1055, 2429, 281, 253, 789, 1182, 14451, 1162, 355, 43425, 50276, 19, 186, 249, 3961, 953, 342, 13444, 8680, 268, 2804, 13156, 413, 1162, 355, 4765, 616, 10012, 374, 18303, 326, 253, 14938, 3033, 556, 271, 3081, 1307, 2412, 18, 3005, 75, 1162, 1952, 534, 1057, 417, 3469, 327, 246, 2299, 275, 634, 10012, 495, 352, 556, 271, 3081, 1307, 8084, 21, 1162, 1952, 18, 43321, 246, 534, 556, 253, 10096, 327, 246, 2139, 310, 326, 476, 352, 320, 5520, 812, 368, 15313, 253, 2406, 3033, 285, 2810, 253, 8037, 327, 253, 10096, 273, 253, 5778, 2180, 50276, 20, 186, 249, 9376, 337, 352, 19584, 326, 253, 20219, 273, 4430, 247, 3637, 271, 3907, 5778, 3425, 29201, 255, 835, 1016, 3284, 310, 247, 3632, 4778, 14127, 29201, 28233, 476, 253, 906, 320, 14923, 281, 253, 4758, 835, 29201, 255, 956, 1027, 10670, 672, 246, 16149, 50276, 21, 186, 22309, 760, 253, 1307, 305, 67, 18, 2722, 598, 275, 253, 14938, 3033, 533, 417, 305, 2612, 323, 246, 18, 50276, 22, 186, 1747, 13218, 253, 10704, 4679, 849, 513, 368, 5206, 253, 1881, 39910, 22958, 14682, 1159, 269, 89, 285, 253, 25275, 3486, 1159, 305, 2612, 275, 1635, 476, 368, 7277, 253, 14938, 875, 253, 4758, 342, 5778, 1055, 285, 326, 1293, 253, 5778, 1055, 285, 1027, 5778, 10670, 949, 534, 281, 17093, 253, 4833, 273, 253, 13444, 8680, 50276, 783, 1895, 436, 2929, 5421, 310, 2931, 275, 247, 2590, 1039, 533, 4477, 778, 878, 281, 22175, 253, 7681, 7680, 326, 310, 4270, 2220, 253, 5368, 789, 285, 7568, 253, 6863, 1255, 273, 253, 14938, 3033, 5474, 33032, 2520, 2929, 29328, 247, 4471, 21201, 3961, 262, 278, 357, 7792, 342, 1264, 15958, 15711, 15210, 400, 1025, 10491, 13444, 8680, 285, 1881, 39910, 22958, 17971, 253, 2929, 29328, 247, 44274, 3342, 300, 22993, 3113, 7555, 19416, 44333, 44274, 3342, 4989, 3646, 323, 253, 747, 278, 357, 7792, 323, 2087, 8680, 20219, 342, 11542, 12656, 253, 4477, 2692, 326, 253, 13444, 10491, 8680, 556, 21842, 12339, 327, 14938, 285, 25275, 4815, 840, 12845, 436, 2234, 958, 281, 15313, 326, 253, 44274, 3342, 4989, 3646, 33526, 32643, 14938, 285, 25275, 2105, 275, 253, 747, 278, 357, 7792, 253, 10527, 14493, 403, 16058, 407, 4679, 327, 10872, 342, 495, 6174, 970, 7001, 251, 2278, 941, 5847, 50276, 18, 186, 783, 16038, 323, 16984, 253, 1264, 
2616, 310, 5544, 1077, 4518, 253, 5740, 273, 5933, 337, 285, 253, 30328, 3212, 1016, 3213, 275, 253, 3646, 403, 671, 973, 15720, 374, 186, 783, 14023, 342, 2905, 2987, 1677, 275, 2593, 374, 403, 9371, 495, 186, 783, 10527, 1543, 281, 619, 4685, 403, 6012, 762, 9648, 2087, 13260, 253, 42060, 875, 4430, 17777, 7147, 4430, 6820, 20219, 403, 4217, 281, 923, 50276, 5040, 50276, 18, 186, 783, 3368, 7103, 310, 12497, 275, 619, 4743, 275, 2426, 273, 49160, 253, 2530, 14938, 14493, 891, 1089, 253, 3559, 5661, 1543, 8489, 6891, 1677, 326, 253, 6012, 14493, 3831, 1142, 3237, 29765, 3602, 824, 347, 4430, 1180, 465, 5778, 15355, 1407, 85, 891, 1158, 3081, 4679, 342, 1027, 873, 8777, 651, 17084, 253, 16774, 1543, 323, 1650, 651, 352, 320, 1896, 281, 8171, 253, 2622, 5778, 10670, 342, 643, 10670, 752, 253, 1543, 651, 1007, 751, 387, 1027, 1881, 39910, 22958, 14682, 1159, 285, 25275, 3486, 1159, 23000, 891, 651, 1902, 253, 4679, 281, 2085, 690, 16774, 1941, 323, 253, 5373, 273, 7296, 841, 3081, 2616, 323, 1650, 452, 253, 4477, 2783, 3368, 14023, 875, 44274, 3342, 4989, 327, 253, 747, 278, 357, 7792, 7147, 247, 3646, 327, 253, 2629, 19191, 278, 357, 7792, 374, 186, 74, 13366, 10141, 1182, 14451, 1162, 355, 43425, 534, 2168, 2783, 767, 562, 273, 253, 1264, 747, 2616, 281, 619, 4685, 253, 2022, 7680, 432, 436, 2929, 310, 253, 3081, 2803, 273, 13444, 8680, 275, 253, 278, 357, 7792, 1223, 891, 5194, 342, 253, 9125, 31471, 273, 6941, 8680, 20219, 891, 717, 417, 2119, 849, 1199, 38135, 390, 752, 403, 253, 747, 5697, 3058, 275, 253, 2216, 273, 253, 44274, 3342, 4989, 3646, 285, 253, 10527, 1783, 4496, 3451, 479, 604, 891, 497, 3430, 352, 4620, 326, 5933, 337, 19986, 342, 3646, 374, 275, 1182, 14451, 1162, 355, 43425, 760, 275, 253, 30211, 3408, 281, 2486, 5778, 2856, 715, 253, 26447, 17705, 310, 352, 253, 4755, 4327, 390, 310, 627, 625, 18144, 4154, 21281, 50274, 8826, 5884, 13991, 50276, 18, 186, 783, 15965, 7473, 1320, 323, 1881, 39910, 22958, 14682, 310, 247, 1652, 2834, 281, 1089, 275, 253, 2929, 374, 186, 74, 2868, 39116, 310, 1620, 19186, 2931, 275, 253, 2929, 436, 778, 320, 247, 2629, 2048, 533, 891, 1158, 4933, 697, 5426, 651, 1335, 320, 9371, 495, 186, 12756, 352, 320, 1896, 281, 1616, 246, 18, 246, 19, 281, 5224, 253, 16095, 273, 1016, 3408, 672, 38542, 253, 3368, 1543, 689, 253, 4236, 673, 16892, 246, 50276, 34974, 1309, 30080, 22559, 2180, 50276, 32897, 2953, 285, 19148, 253, 772, 1840, 275, 1798, 812, 253, 4477, 6780, 752, 403, 253, 10527, 38135, 273, 253, 4081, 7792, 3646, 285, 10527, 3033, 3538, 569, 275, 5301, 281, 253, 11106, 2987, 50275, 74, 6273, 323, 721, 42876, 1840, 253, 14924, 7887, 253, 2929, 10832, 1582, 253, 24683, 4736, 273, 24049, 253, 6036, 2538, 273, 1264, 747, 2616, 715, 247, 625, 15958, 278, 357, 7792, 253, 4081, 44274, 3342, 4989, 3646, 310, 2011, 281, 5115, 11408, 32643, 14938, 84, 1293, 13622, 25275, 9100, 285, 253, 10527, 1543, 476, 4647, 281, 3240, 2087, 5778, 10670, 3738, 891, 1158, 253, 2929, 310, 973, 15720, 285, 253, 1543, 403, 4217, 891, 1089, 253, 1655, 2715, 14999, 275, 767, 2022, 7794, 806, 253, 4679, 403, 3710, 275, 1635, 281, 49160, 253, 10527, 14493, 253, 4679, 812, 671, 2085, 625, 16039, 670, 253, 44274, 3342, 4989, 3646, 285, 263, 7277, 342, 5795, 278, 357, 31225, 281, 7568, 253, 11361, 273, 24049, 253, 747, 2616, 1273, 253, 7681, 38135, 285, 8453, 943, 320, 16318, 275, 247, 625, 2590, 1039, 50276, 187, 187, 4118, 18435, 27, 2520, 2929, 39223, 247, 3961, 262, 1895, 326, 31167, 1264, 7881, 17194, 407, 1846, 3374, 14494, 275, 3909, 3818, 3109, 2718, 13444, 10921, 
15210, 400, 1025, 17947, 285, 1881, 39910, 22958, 2608, 14682, 253, 4477, 12661, 271, 2746, 1925, 44274, 3342, 300, 22993, 3113, 7555, 19416, 44333, 44274, 3342, 4989, 323, 436, 1895, 285, 2085, 247, 10527, 1783, 4645, 326, 44274, 3342, 4989, 33526, 253, 8654, 14938, 14493, 616, 1783, 671, 8018, 326, 32643, 14938, 285, 25275, 2105, 3116, 4142, 403, 39941, 762, 436, 4758, 841, 10527, 1543, 403, 4516, 407, 16774, 4679, 24088, 970, 7001, 251, 2278, 941, 253, 2022, 4468, 342, 436, 2929, 310, 326, 253, 2783, 7881, 452, 512, 644, 11463, 1070, 2168, 275, 1027, 3961, 262, 7533, 594, 253, 38135, 1060, 310, 326, 597, 403, 1146, 11463, 1070, 17965, 352, 651, 320, 625, 21414, 604, 4679, 2908, 1666, 25379, 432, 841, 5368, 7533, 281, 41509, 253, 878, 323, 247, 747, 5700, 2581, 685, 3365, 22128, 327, 3082, 326, 452, 644, 4081, 3786, 281, 2953, 1016, 273, 841, 3237, 10939, 253, 4679, 4390, 3831, 760, 247, 8245, 323, 3961, 953, 342, 1881, 39910, 22958, 2608, 14682, 534, 556, 644, 2879, 1309, 253, 30080, 22559, 3408 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

Summary and contributions: The paper proposes a generative model for the motion of surface waves in open and closed geometries. While neural networks have been applied to simulate fluid dynamics before, they are purported to suffer from poor generalization to unseen geometries for long-time predictions. The paper proposes a U-Net-based model trained with a modified loss function that incorporates the gradient of the field into the loss. The results demonstrate generalization to unseen geometries, with good predictions up to 80 time steps into the future.

Detailed review: the following is the detailed review of the paper, organized into strengths and weaknesses subsections.

Strengths:
- Relevance and significance: the utilization of DNNs for modeling physical phenomena is compelling and gaining steam; in particular, the modeling of spatiotemporal phenomena like unsteady fluid dynamics for complex geometries should be of interest to the ML community.
- Clarity: the paper is written well and is easy to understand.
- Reproducibility: should be reproducible.

Weaknesses:
- Relation to prior art: the paper does a reasonable job of presenting the prior art but fails at identifying the unmet need that the presented work fulfills. The claimed speedup over numerical solvers has been achieved before (Guo et al., 2016). Further, DNNs have been used to model surface waves (Fotiadis et al., ICLR workshop 2020; Sorteberg et al., NeurIPS workshops 2018; Kim et al., Eurographics 2019; etc.). Sorteberg et al. (2018) claim to use an LSTM for modeling up to 80 time steps into the future on a dataset not seen during training. It is not clear what advantages the present work is supposed to have over the SOTA, and why.
- Methodology: the proposed approach is a straightforward application of U-Net to predict a spatial field given the past few spatial fields stacked together. However, U-Nets, LSTMs, ConvLSTMs, and other architectures have been tried before; it is unclear what the novel contribution in this paper is (the gradient-augmented loss function?) and why it would be instrumental in handling unseen geometries over longer periods of time.
- Novelty: the presented approach is a straightforward application of known techniques; further, these approaches have been tried before (see prior art).
- Empirical evaluation: there is no evaluation against the state of the art. It is important to compare the performance against Fotiadis et al. (ICLR workshops 2020), Sorteberg et al. (NeurIPS workshops 2018), Kim et al. (Eurographics 2019), etc.

Assessment: though the problem seems relevant and of significance to the research community, the paper suffers from a lack of novelty and a nonexistent comparison against the state of the art. It fails to identify the unmet needs it is addressing and what novel contributions allow it to achieve them. I do not recommend the publication of this paper.

---

In this paper, a methodology for simulating wave dynamics is presented, based on convolutional neural networks. A standard analytic wave-dynamics solver was used to generate a large dataset of 2D wave simulations. A deep net based on U-Net was trained to predict the next state of the wave field given the five previous states. The training was done with both a standard MSE loss and also "gradloss", which placed a loss on the gradients of the field as well. The results show that the network trained with gradloss performs significantly better over successive rollouts than the one trained with MSE. The results also show that the trained network is able to generalize to qualitatively different environments from the training set.

If there was one word I would use to describe this paper, it is "thorough". The introduction and related work do a good job going over related contributions in the literature and giving the reader (particularly a reader with an ML background and not necessarily a fluid-dynamics background) a good understanding of what has been done to this point. The same can be said for the results: the paper offers a thorough analysis of the technique. The results first show why gradloss is necessary; they then go on to show the performance of the model in a variety of scenarios qualitatively different than the scenarios seen during training, including curved surfaces and multiple droplets, and the model performs well in these scenarios. Additionally, the paper presents a scenario that the model fails at (long-term predictions with two droplets in a narrow channel) and briefly discusses why the model fails. The strength of this paper is in the thoroughness of its results, which includes presenting results where the model fails. The weakness of this paper is that it doesn't present any novel techniques: it's an existing architecture (U-Net) applied in a new domain (wave simulation). However, experimental papers provide value as well, and the field is grossly lacking in them; to that end this paper could be a useful contribution. The results are thorough and even show the limitations of the model. For a full experimental paper it would be nice to do even more analysis, although with a limited number of pages that may not always be possible. However, I would have liked to have seen data showing how much faster this model is than the standard solvers: the paper claims in the intro and conclusion that their model is 10^4 times faster than the standard method, but nowhere in the paper is data provided to back this up. This seems like one of the main reasons for using this methodology, so not having data to show it is a big oversight. Still, overall, while there is room for improvement, this is a quality paper and should be accepted. A couple of small things to fix up for the camera-ready version: a related paper that should probably be cited is "Accelerating Eulerian Fluid Simulation with Convolutional Networks", which does very similar things with deep nets and fluid dynamics; and Fig. 7 says predictions are on the left and ground truth is on the right, but I believe that is flipped (predictions are on the right, ground truth is on the left).
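The gradient-augmented objective ("gradloss") that this review and the next one describe can be made concrete with a small sketch. The following is an illustration only, assuming an MSE term on the predicted height field plus an MSE term on its finite-difference spatial gradients; the weighting factor `grad_weight` and the use of `torch.gradient` are my assumptions, not necessarily the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def gradient_augmented_loss(pred, target, grad_weight=1.0):
    """MSE on the wave-height field plus MSE on its spatial gradients.

    pred, target: tensors of shape (batch, 1, H, W).
    grad_weight:  illustrative weighting of the gradient term (an assumption).
    """
    field_loss = F.mse_loss(pred, target)

    # Finite-difference gradients along the two spatial axes.
    pred_dy, pred_dx = torch.gradient(pred, dim=[-2, -1])
    targ_dy, targ_dx = torch.gradient(target, dim=[-2, -1])
    grad_loss = F.mse_loss(pred_dx, targ_dx) + F.mse_loss(pred_dy, targ_dy)

    return field_loss + grad_weight * grad_loss
```

Penalizing the gradient error is one plausible reason the rollouts stay sharper: plain MSE tends to reward overly smooth predictions, while the extra term also penalizes blurred wave fronts.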
---

Summary: The paper applies a fully convolutional U-Net model for next-step prediction of the height field for 2D wave dynamics. On the domains tested, the predictions remain accurate for 20 timesteps, and the method seems to provide considerable speedups compared to a state-of-the-art spectral/hp element numerical solver. Some encouraging results on generalization to new domain shapes and larger-scale domains are also presented in the experimental section.

Strong points: the manuscript is very well written, the methods are clear, and the generalization results are encouraging.

Weaknesses:
1. The paper lacks a novel contribution on the architectural and application side. U-Nets have been previously used for forward dynamics predictions (e.g., [1] Thuerey et al.), and there are other works using neural networks for wave prediction (e.g., [2] Fotiadis et al.). The architecture is a straightforward application of a U-Net architecture plus a loss function with the MSE of the predictions and of its gradients.
2. There are no comparisons to other baselines. The code for the U-Net architecture from [1] Thuerey et al. is open-sourced, so this would be a natural baseline to include.
3. The MSE errors are not reported on the full dataset; instead, plots for single trajectories are presented.
4. Many of the modeling choices are not well motivated or explained. For example: (a) in the training set dt = 0.03 s, and for unrolling it is set to dt = 0.12 s, but no insight on this choice was provided; (b) a history of the previous 5 timesteps is used as input to the network, with no ablation or justification provided; (c) in the training details the authors state that the time origin for the input sequences is not at t = 0 but is instead randomly selected, and that five timesteps are performed before updating the network weights, but nothing else is discussed; (d) the authors claim that their model is 4 orders of magnitude faster than a state-of-the-art spectral/hp element numerical solver, but the actual runtimes per dataset are not provided and there is no discussion in terms of scalability (e.g., how does it scale with the spatial resolution?); (e) in the generalization-to-a-large-domain experiment (Section 4.4), the authors justify the relatively poor performance of the approach by stating that "this problem could be addressed by training the U-Net for longer output sequences and possibly increasing the depth of the network"; since this is a simple hypothesis, I feel it should have been tested.
5. Most experiments show results for 20-timestep unrolls, while for longer sequences dissipation was observed, which is a strong limitation of the model.

Because of the points above, I think this manuscript does not pass the ICLR threshold for acceptance, although I believe this would be a good workshop paper submission.

[1] Nils Thuerey, Konstantin Weissenow, Lukas Prantl, and Xiangyu Hu. Deep learning methods for Reynolds-averaged Navier-Stokes simulations of airfoil flows.
[2] Stathi Fotiadis, Eduardo Pignatelli, Mario Lino Valencia, Chris Cantwell, Amos Storkey, and Anil A. Bharath. Comparing recurrent and convolutional neural networks for predicting wave propagation.

---

The authors use a U-Net architecture network to predict the motion and interaction of surface waves in open and closed complex geometries. The network is trained on data with simple box and right-angled corner geometries and generalizes well to other complex geometric configurations. The neural-network-based method runs much faster than the standard numerical simulation obtained by directly solving the PDE.

Strengths:
1. Much faster than directly solving the inviscid two-dimensional shallow water equations using a standard numerical solver.
2. The network can generalize to geometric configurations and initial conditions not seen during training.

Weaknesses:
1. The predictions over more extended periods are not very accurate. Although training the U-Net for longer output sequences and increasing the network's depth may help, this may also increase the difficulty of training.
2. Some previous works also used the U-Net to predict wave dynamics, as in [3]; it is not clear what the novelty is, if any, in the proposed network architecture.
3. Not enough experiments: how does the model generalize with more complicated initial conditions, for example five or ten droplets? Furthermore, there is no comparison to other existing work.

A further question: the model generalizes well to other boundary geometries. Is this because the wave reflection at the boundary is a local phenomenon, so the convolutional network only needs to learn the local reflection?

Some related works:
[1] Zhu Weiqiang, Yixiao Sheng, and Yi Sun. Wave dynamics simulation using deep neural networks. 2017.
[2] Sorteberg, Wilhelm E., et al. Approximating the solution to wave propagation using deep neural networks. arXiv preprint arXiv:1812.01609, 2018.
[3] Fotiadis, Stathi, et al. Comparing recurrent and convolutional neural networks for predicting wave propagation. arXiv preprint arXiv:2002.08981, 2020.

### Summary:
Reviews were somewhat mixed here, but the consensus is to reject, with at least one voice (R2) urging rejection. Across reviewers, the recommendation to reject is primarily based on the level of originality of the proposed U-Net architecture and on the weakness of the experiments, especially in comparing to baselines.

Reviewers found strengths in the paper's writing and in its demonstration of generalization to unseen geometries. However, reviewers noted that the architecture does not win originality/significance points, including R3, the most positive reviewer:

- R3: "the weakness of this paper is that it doesn't present any novel techniques; it's an existing architecture (U-Net) applied in a new domain (wave simulation)."
- R2: "the proposed approach is a straightforward application of U-Net to predict a spatial field given the past few spatial fields stacked together. However, U-Nets, LSTMs, ConvLSTMs, and other architectures have been tried before; it is unclear what the novel contribution in this paper is, and why it would be instrumental in handling unseen geometries over longer periods of time."
- R2 (post-response): "this paper is a clear reject; none of the contributions are novel."
- R4: "the paper lacks a novel contribution from the architectural and application side."
- R1: "some previous works also used the U-Net to predict wave dynamics; it is not clear what the novelty is, if any, in the proposed network architecture."

Reviewers also noted weaknesses in the experiments, acknowledged by R3 (the most positive reviewer), though that review did not consider them a fatal flaw:

- R1: "not enough experiments: how does the model generalize with more complicated initial conditions, for example five or ten droplets? Furthermore, there is no comparison to other existing work."
- R2: "there is no evaluation against the state of the art."
- R2 (post-response): "application of DNNs to this problem, speedups over numerical solvers, etc., have all been explored by SOTA works which have not been compared against; there is no clear articulation of the claimed novel contributions over the SOTA, and no empirical validation or theoretical reasoning of the same."
- R4: "there are no comparisons to other baselines."
- R3: "reviewer 4 brings up some fair points about experimental issues. I'm not as concerned about the lack of a baseline comparison; that doesn't seem to be the point of this paper, and there is only so much that can be done in an 8-page conference paper. However, given that the other 2 reviewers think the paper could use more work, it would be completely reasonable for the chairs to reject it based on those reviews."

Based on this consensus of reviews, my recommendation is to reject. I hope the feedback from the reviews is helpful to the authors.
[tokenized columns for the example above (input_ids, attention_mask, labels): long integer lists omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper presents a new model for reading and writing memory in the context of taskoriented dialogue the model contains three main components an encoder a decoder and an external kb the external kb is in the format of an svo triple store the encoder encodes the dialogue history and in doing so writes its hidden states to memory and generates a global memory pointer as its last hidden state the decoder takes as input the global memory pointer the encoded dialogue state history and the external kb and then generates a response using a twostep process in which it 1 generates a template response using tags to designate slots that need filling and 2 looks up the correct filler for each slot using the template/global memory pointer as a query the authors evaluate the model on a simulated dialogue dataset babi and on a humanhuman dataset stanford multidomain dialogue or smd as well as in a human eval they show substantial improvements over existing models on smd the more interesting of the datasets in terms of entity f1 ie the number of correctly generated entities in the response they also show improvement on babi specifically on cases involving oovs on the human evaluation they show improvements in terms of both appropriateness and humanlikeness overall i think this is a nice and wellmotivated model i very much appreciate the thoroughness of the evaluation two different datasets plus a human evaluation the level of analysis of the model was also good although there inevitably could have been more since it is such a complex model i would have liked to see more thorough ablations or at least better descriptions of the baselines in order to better understand which specific pieces of the model yield which types of gains a few particular questions below you describe the auxiliary loss on the global pointer and mention an ablation study that shows that this improves performance maybe i am overlooking something but i cannot find this ablation in the paper or appendix it would be nice to see how large the effect is following on the above why no similar auxiliary losses on additional components eg the template generation were these tried and deemed unnecessary or viceversa ie the default was no auxiliary loss and they were only added when needed either way it would be nice to better communicate the experiments/intuitions that motivated the particular architecture you arrived at i really appreciate that you run a human eval but why not have humans evaluate objective correctness as well it seems trivial to ask people to say whether or not the answer is correct/communicates the same information as the gold docsepthis is in general a wellwritten paper with extensive experimentation the authors tried to describe their architecture both with equations as well as graphically however i would like to mention the following in section 21 i am not sure all the symbols are clearly defined for example i could not locate the definitions of n l etc even if they are easy to assume i am fond of appropriate definitions also i suspect that some symbols like n are not used consistently across the manuscript i am also confused about the loss function which loss function is used when i am missing one more figure from fig 2 its not so straightforward to see how the encoderdecoder along with the shared kb work at the same time ie not independently in section 23 its not clear to me how the expected output word will be
picked up from the local memory pointer same goes for the entity table how can you guarantee that that position nl1 is a null token what was the initial query vector and how did you initialise that did different initialisations have any effect on performance if you can please provide an example of a memory position also i would like to see a description of how the oov tasks are handled finally your method is a nn endtoend one and i was wondering how do you compare not with other endtoend approaches but with a traditional approach such as pydial and some minor suggestions not all the abbreviations are defined for example qrn gmn kvr it would also be nice to have the references of the respective methods included in the tables or their captions parts of figs 12 are pixelised it would be nice to have everything vectorised i would prefer to see the training details in fact i would even be in favor of having more of those in the main body of the manuscript rather than in the appendix there are some minor typos such as our approach that utilizing the recurrent or in each datasetsdocsepthis paper puts forward a new globallocal memory pointer network to tackle the taskoriented dialogue problem the idea of introducing global memory is novel and experimental results show its effectiveness to encode external knowledge in most cases here are some comments 1 in global memory pointer the authors employ nonnormalized probability nonsoftmax what is the difference in performance if one uses softmax 2 in 11 there are no linear weights will higher weights in globallocal help 3 as pointed out in the ablation study its weird that in task5 global memory pointer does not help 4 the main competitor of this algorithm is mem2seq while mem2seq includes dstc2 and incar assistant and especially incar assistant provides the first example dialogue why does the paper not include experiments on these two datasets ### Summary:
interesting paper applying memory networks that encode external knowledge represented in the form of triples and conversation context for task oriented dialogues experiments demonstrate improvements over the state of the art on two public datasets notation and presentation in the first version of the paper were not very clear hence many questions and answers were exchanged during the reviews
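The lookup step both reviewers probe (how a decoder query against the key-value memory picks the filler for a sketch slot, and how the global pointer gates that choice) can be illustrated with a minimal sketch. Everything below is an assumption for illustration, not the paper's actual API: the function name, the shapes, and the use of a plain dot-product attention over memory slots.

```python
import torch
import torch.nn.functional as F

def memory_pointer_lookup(query, memory_keys, memory_values, global_pointer=None):
    # Score each memory slot (dialogue-history states and KB triple embeddings)
    # against the decoder query; the global pointer, if given, soft-filters slots.
    scores = memory_keys @ query                    # (num_slots,)
    if global_pointer is not None:
        scores = scores * global_pointer
    probs = F.softmax(scores, dim=-1)               # local pointer distribution
    slot = probs.argmax(dim=-1)                     # position to copy from
    return memory_values[slot], probs

# Toy usage: 5 memory slots of width 8; the query is the decoder state at a
# sketch-tag position, and the values are indices of candidate filler tokens.
keys = torch.randn(5, 8)
values = torch.tensor([101, 102, 103, 104, 105])
query = torch.randn(8)
filler, probs = memory_pointer_lookup(query, keys, values, global_pointer=torch.rand(5))
```

In such a scheme the null token the second reviewer asks about would simply be an extra reserved slot, giving the pointer somewhere to land when no memory entry should be copied.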
[ … ]
[ 1, 1, …, 1 ]
[ … ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper introduces orthogonal transformation of tokens and combines it with the idea of interleaving window selfattention wsa and dilated selfattention dsa to capture both local and global interactions without incurring the prohibitive cost of full global self attention they showed that initialization and optimization of orthogonal matrices can be simplified by using householder transformations and leveraging already established optimization techniques to learn these matrices in addition the paper also makes use of positional mlps where mlps are equipped with depthwise convolutions to allow for downsampling within the transformer block authors individually ablated different techniques proposed in the paper and combined the best configurations to achieve top performance on a wide range of image tasks the main differentiating contribution of the paper is using the householder transformation trick to introduce orthogonal transformations this idea is coupled with already existing techniques of window selfattention and dilated selfattention authors present extensive experiments ablating each individual technique introduced the paper is also well written and easy to read and understand given that the paper is just a concatenation of already existing ideas with minimal novelty of newly introduced techniques i am not inclined to accept the paper docsepthe paper proposes a new efficient selfattention form orthogonal selfattention osa that conducts selfattention within groups where orthogonalization of window tokens is performed first before forming the groups the final architecture consists of alternating osa and window selfattention wsa as attention with ffn equipped with depthwise convolutions it shows competitive results on imagenet classification coco object detection ade20k semantic segmentation strengths the way to construct an orthogonal matrix endogenously by multiplying householder reflectors is novel and interesting the osa form proposed enables the token connections within a local window and links across windows the experimental results show competitive performance across different vision tasks including image classification object detection semantic segmentation weaknesses the essentialness of using an orthogonal matrix is not studied the whole osa process 1 connects tokens within local windows with local window token orthogonalization this serves as an mlp layer within local windows except that the weight matrix of the mlp is naturally orthogonal 2 connects tokens beyond local windows by forming new groups across previous local windows 3 token reverse as the inverse of an orthogonal matrix is easy to get just the transpose of the matrix step 2 can be done regardless of whether the weight matrix of this local window mlp is orthogonal or not step 3 is the vital part that only an orthogonal weight matrix can perform i believe this should be studied which is not presented for validating the essentialness of using an orthogonal matrix rather than just following the form that connects local and connects beyond local windows na docsepthis paper proposed an efficient transformer backbone orthogonal transformer for efficient processing of visual inputs the backbone is mainly built on the proposed efficient selfattention mechanism orthogonal selfattention osa to model dependencies between tokens in the orthogonal space the proposed backbone achieves excellent performance/flops tradeoff on visual tasks strengths the
performances of the proposed backbone are superior the paper is well written and easy to follow and the illustration is clear to me weaknesses the motivation is not clear to me why do other efficient selfattention approaches fail the author claimed other efficient selfattention mechanisms lose finelevel details for coarse global selfattention or hurt longrange modeling for local selfattention l3536 however i cannot find any pieces of evidence to support the claim in addition how does the proposed orthogonal selfattention osa work the author claimed osa captures global dependency without losing finelevel details l3738 i have found no evidence to support the claim either theoretically or empirically the contribution is not significant i believe the only novel part of this paper is the proposed efficient selfattention approach the proposed positional mlp in sec 34 has been studied in pvt v2 cvmj 22 ceit iccv 21 and cmt cvpr 22 so i would not say this is the novel part the ablation is not sufficient in tab 6 the authors compared the proposed osa with window sa and dilated sa how about the window sa with shifted window like swin yes docsepthe paper proposed a token orthogonalization layer where selfattention is performed on groups of orthogonalized tokens the paper then designs the orthogonal transformer ot which combines the orthogonal selfattention layer with a pyramid transformer architecture positional mlps adding a depthwise convolution into the mlp layer early convolutions and a novel downsampling method between stages ot performs well on imagenet from scratch coco detection and instance segmentation and ade20k semantic segmentation strengths i like the proposal of constructing a trainable orthogonalization module from the product of householder matrices the orthogonal selfattention layer is fairly simple and general and i think it is a nice contribution the experiments cover multiple tasks appear thorough and compare to many stateoftheart alternatives there is an ablation of the four components of the orthogonal transformer weaknesses i think the main weakness is the large number of moving parts in ot the combination of both windowed attention and orthogonal attention additional convolutions at the start and in the middle of the network and a new downsampling mechanism the combination of these factors significantly increases the complexity of the network over the original transformer/vision transformer potentially limiting adoption the ablation study shows that some of the minor components have an equal or greater impact than the orthogonal attention layer which is the main selling point for ot eg conv position embeddings improve the scores on all tasks over absolute position embeddings more than orthogonal attention therefore it feels like the orthogonal selfattention layer is not the key driver of performance in ot given that the network contains many visionspecific components convolutions i feel that orthogonal transformer is overselling or overgeneralizing the network a name like orthogonal vision/image transformer would make it more clear that this is a visionspecific variant there are a few typos such as tokens has a lower resolution → have and singe otb → single otb section f in the appendix is dedicated to limitations and societal impact the societal impact is adequately addressed the limitations section is brief but i think it is adequate and i believe captures the main limitation that the study is restricted only to imagebased tasks ### Summary:
the paper presents an orthogonal attention mechanism for vision transformers all reviewers found the overall system has good performance and the introduced orthogonal attention has the potential to be widely used the authors rebuttal resolves the majority of the questions the authors should add their promised additional experiments in the final version
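The Householder trick credited above can be made concrete in a few lines: an orthogonal matrix is accumulated as a product of reflections I - 2vv^T built from learnable vectors, and its inverse (needed for the token-reverse step the second reviewer lists) is just its transpose. This is a minimal sketch under assumed sizes (a handful of reflectors acting on a 16-token window), not the paper's actual layer.

```python
import torch

def householder_orthogonal(vs):
    # Accumulate Q = H_k ... H_1 where H = I - 2 v v^T / ||v||^2 for each row v of vs.
    n = vs.shape[-1]
    q = torch.eye(n, dtype=vs.dtype)
    for v in vs:
        v = v / v.norm()
        q = q - 2.0 * torch.outer(v, v @ q)   # apply H to q without forming H explicitly
    return q

vs = torch.randn(4, 16, requires_grad=True)   # 4 learnable reflectors, window of 16 tokens
q = householder_orthogonal(vs)
print(torch.allclose(q @ q.T, torch.eye(16), atol=1e-5))  # True: q is orthogonal by construction
```

Because each factor is orthogonal regardless of the value of v, standard unconstrained optimizers can update vs directly, which is presumably what the first review means by leveraging already established optimization techniques.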
[ … ]
[ 1, 1, …, 1 ]
[ … ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper proposes an approach to learn nonlinear causal relationships from time series data that is based on empirical risk minimization regularized by mutual information the mutual information at the minimizer of the objective function is used as a causal measure the paper is well written and the proposed method well motivated and intuitive however i am concerned by the assumption that the lagged variables xt1j follow a diagonal gaussian distribution this appears to be very restrictive since the values of time series j at time t1 typically depend say on those at time t2 t3 etc another key concern is scalability the authors mention gene regulatory networks neuroscience etc as key applications yet the experiments considered in the paper are limited to very few time series for instance the simulation experiments use n30 which is much smaller than the number of time series usually involved say in gene regulatory network data the real data experiments use n 6 or n2 this is way too small the real data experiments sections 42 and 43 are not very convincing not only because of the very small size of n but also because there is no comparison with the other approaches how do these compare does the proposed approach offer insights on these datasets which are not captured by the comparison methodsdocsepthis paper aims to estimate timedelayed nonlinear causal influences from time series under the causal sufficiency assumption it is easy to follow and contains a lot of empirical results thanks for the results but i have several questions first in theorem 2 which seems to be a main result of the paper the authors were concerned with the condition when wji 0 but there is no conclusion if wji 0 in order to correctly estimate causal relations from data both cases must be considered second the conclusion of theorem 2 seems to be flawed let me try to make it clear with the following example suppose x1t2 directly causes x2t1 and that x2t1 directly causes x3t without a direct influence from x1t2 to x3t then when minimizing 2 we have the following results step by step 1 the noise standard deviation in x2t1 denoted by eta2 may be nonzero this is because we minimize a tradeoff of the prediction error the first term in 2 and a function of the reciprocal of the noise standard deviation eta2 the second term in 2 not only the prediction error 2 if eta2 is nonzero then x1t2 will be useful for the purpose of predicting x3t note that if eta2 is zero then x1t2 is not useful for predicting x3t from the dseparation perspective this is because x1t2 and x3t are not dseparated by x2t1 eta2 cdot epsilon2 although they are dseparated by x2t1 then the causal markov condition tells us that x1t2 and x3t are not independent conditional on x2t1 eta2 cdot epsilon2 which means that x1t2 is useful for predicting x3t 3 given that x1t2 is useful for predicting x3t when 2 is minimized eta1 will not go to infinity resulting in a nonzero w13 which mistakenly tells us that x1t1 directly structurally causes x3t this illustrates that the conclusion of theorem 2 may be wrong i believe this is because the proof of theorem 2 is flawed in lines 56 on page 16 it does not seem sensible to drop xjt1 etax cdot epsilonx and attain a smaller value of the cost function at the same time please carefully check it especially the argument given in lines 1013 third it is rather surprising that the authors didnt mention
anything about the traditional causal discovery methods based on conditional independence relations in the data known as constraintbased methods such as the pc algorithm spirtes et al 1993 ic algorithm pearl 2000 and fci spirtes et al 1993 such methods are directly applicable to timedelayed causal relations by further considering the constraint that effects temporally follow the causes fourth please make it clear that the proposed method aims to estimate causalityinmean because of the formulation in terms of regression for instance if xjt1 influences only the variance of xit but not its mean then the proposed method may not detect such a causal influence although the constraintbased methods can any response would be highly appreciateddocsepin the manuscript entitled neural causal discovery with learnable input noise the authors describe a method for automated causal inference under the scenario of a stream of temporally structured random variables with no missingness and a lookback window of given size the proposed approach combines a novel measure of the importance of fidelity in each variable to predictive accuracy of the future system state learnable noise risk with a flexible functional approximation neural network although the setting informative temporal data is relatively restricted with respect to the general problem of causal inference this is not unreasonable given the proposed direction of application to automated reasoning in machine learning the simulation and real data experiments are interesting and seem well applied a concern i have is that the manuscript as it stands is positioned somewhere between two distinct fields sparse learning/feature selection and causal inference for counterfactual estimation/decision making but doesnt entirely illustrate its relationship to either in particular the derived criterion is comparable to other sparsityinducing penalties on variable inclusion in machine learning models although it has motivation in causality it is not exclusively derived from this position so one might wonder how alternative sparsity penalties might perform on the same challenge likewise it is not well explained what is the value of the learnt relationships and how uncertainty and errors in the causal learning are relevant to the downstream use of the learnt model in the ordinary feature selection regime one is concerned simply with improving the predictive capacity of models eg a nonlinear model might be fit using just the causal variables that might outperform both a linear model and a nonlinear model fit using all variables here the end goal is less clear this is understandable in the sense that the work is positioned as a piece in a grand objective but it would seem valuable to nevertheless describe some concrete examples to elucidate this aspect of the algorithm use case error effects downstream ### Summary:
granger causality is a beautiful operational definition of causality that reduces causal modeling to the pasttofuture predictive strength the combination of classical granger causality with deep learning is very well motivated as a research problem as such the continuation of the effort in this paper is strongly encouraged however the review process did uncover possible flaws in some of the main original results of this paper the reviewers also expressed concerns that the experiments were unconvincing due to very small data sizes the paper will benefit from a revision and resubmission to another venue and is not ready for acceptance at iclr2019
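The objective the second reviewer's counterexample steps through (a prediction loss traded off against a term that rewards large learnable noise on each lagged input, so that uninformative series are drowned out while informative ones keep a small eta) can be sketched as follows. The shapes, the helper name, and the exact form of the penalty are assumptions for illustration; the paper's equation 2 may differ in detail.

```python
import torch

def learnable_noise_risk(model, x_lagged, y, log_eta, lam=0.1):
    # Corrupt each lagged input series j with Gaussian noise of learnable std eta_j,
    # then trade prediction error against a penalty that shrinks as eta_j grows.
    eta = log_eta.exp()                                   # one noise std per input series
    x_noisy = x_lagged + eta * torch.randn_like(x_lagged)
    pred = model(x_noisy)
    mse = ((pred - y) ** 2).mean()
    penalty = lam * torch.log1p(1.0 / eta ** 2).sum()     # roughly an information-style cost on 1/eta
    return mse + penalty

# Toy usage: two lagged series predicting one target with a linear "model".
model = torch.nn.Linear(2, 1)
log_eta = torch.zeros(2, requires_grad=True)
x, y = torch.randn(64, 2), torch.randn(64, 1)
loss = learnable_noise_risk(model, x, y, log_eta)
loss.backward()
```

Under this kind of objective the reviewer's point is easy to restate: if the noise level on the mediating series stays finite at the minimum, an earlier lag can remain predictive through the noisy mediator, so a nonzero weight on it need not indicate a direct structural cause.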
[ …
2469, 936, 32279, 15970, 4757, 253, 5019, 273, 8946, 650, 3751, 46449, 342, 3676, 4715, 310, 1077, 973, 17194, 347, 247, 2561, 1895, 347, 824, 253, 26272, 273, 253, 3434, 275, 436, 2929, 310, 7052, 14659, 2299, 253, 2278, 1232, 858, 32355, 1896, 32138, 275, 690, 273, 253, 2022, 3236, 1543, 273, 436, 2929, 253, 30628, 671, 4469, 7350, 326, 253, 4679, 497, 10915, 87, 19163, 1955, 281, 1077, 1355, 941, 9552, 253, 2929, 588, 5649, 432, 247, 18520, 285, 501, 538, 2230, 281, 1529, 18767, 285, 310, 417, 4704, 323, 14924, 387, 17857, 32888, 9638 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 29328, 271, 2746, 281, 3037, 14561, 19349, 2954, 432, 673, 2962, 941, 326, 310, 1754, 327, 16774, 2495, 41458, 3963, 1025, 407, 15577, 1491, 50276, 783, 15577, 1491, 387, 253, 7221, 6081, 273, 253, 8103, 1159, 50276, 261, 908, 347, 19349, 2557, 253, 2929, 310, 973, 3542, 285, 253, 4081, 1332, 973, 41509, 285, 27350, 50275, 35529, 891, 717, 7514, 407, 253, 9376, 326, 253, 16653, 2400, 4903, 209, 633, 18, 75, 956, 247, 16421, 305, 12064, 3268, 436, 4620, 281, 320, 1077, 29190, 1580, 5431, 253, 2193, 273, 673, 2962, 480, 387, 673, 246, 18, 403, 5431, 7293, 1333, 273, 1110, 326, 673, 246, 19, 246, 20, 3966, 50276, 23955, 2234, 4468, 7350, 9171, 1430, 50276, 783, 4477, 3748, 3320, 10545, 6928, 50276, 570, 1822, 21559, 3966, 347, 2234, 4893, 2568, 253, 4679, 2783, 275, 253, 2929, 403, 3710, 281, 1077, 1643, 673, 2962, 323, 4227, 253, 9864, 4679, 897, 50276, 79, 1229, 50276, 4609, 310, 1199, 4577, 685, 253, 1180, 273, 673, 2962, 3798, 3206, 1333, 275, 3320, 10545, 2990, 941, 50276, 783, 1524, 941, 4679, 897, 295, 721, 390, 295, 19, 436, 310, 1039, 281, 1355, 50275, 783, 1524, 941, 4679, 7118, 5976, 285, 7652, 403, 417, 1077, 21414, 417, 760, 984, 273, 253, 1077, 1355, 1979, 273, 295, 533, 671, 984, 627, 310, 642, 5301, 342, 253, 643, 7274, 50276, 5430, 513, 841, 7277, 1057, 253, 4081, 2746, 3959, 50276, 968, 4380, 327, 841, 15302, 534, 403, 417, 10848, 407, 253, 5301, 3082, 7152, 33032, 2520, 2929, 13698, 281, 6642, 37282, 293, 19416, 14561, 19349, 16178, 432, 673, 2962, 762, 253, 19349, 32572, 9376, 352, 310, 3477, 281, 956, 285, 4428, 247, 2257, 273, 16774, 1543, 6701, 323, 253, 1543, 533, 891, 452, 2067, 3533, 50276, 7053, 275, 10012, 374, 534, 3133, 281, 320, 247, 2022, 906, 273, 253, 2929, 253, 4477, 497, 7514, 342, 253, 1617, 672, 259, 8020, 470, 533, 627, 310, 417, 6452, 604, 259, 8020, 470, 275, 1340, 281, 9113, 6642, 19349, 2493, 432, 941, 1097, 2219, 1364, 320, 2783, 50276, 9815, 253, 6452, 273, 10012, 374, 3133, 281, 320, 33657, 1339, 479, 1611, 281, 1056, 352, 2590, 342, 253, 1563, 1650, 9428, 1269, 18, 85, 19, 3587, 5997, 1269, 19, 85, 18, 285, 326, 1269, 19, 85, 18, 3587, 5997, 1269, 20, 85, 1293, 247, 1480, 4833, 432, 1269, 18, 85, 19, 50276, 936, 1269, 20, 85, 840, 672, 28699, 374, 359, 452, 253, 1563, 1543, 3213, 407, 3213, 337, 253, 6046, 2629, 11254, 275, 1269, 19, 85, 18, 17007, 407, 1162, 66, 19, 778, 320, 28078, 436, 310, 984, 359, 15338, 247, 5454, 2727, 273, 253, 10554, 2228, 253, 806, 1307, 275, 374, 285, 247, 1159, 273, 253, 33561, 273, 253, 6046, 2629, 11254, 1162, 66, 19, 253, 1273, 1307, 275, 374, 417, 760, 253, 10554, 2228, 374, 604, 1162, 66, 19, 310, 28078, 840, 1269, 18, 85, 19, 588, 320, 4217, 323, 253, 4096, 273, 21565, 1269, 20, 85, 3877, 326, 604, 1162, 66, 19, 310, 5058, 840, 1269, 18, 85, 19, 310, 417, 4217, 323, 21565, 1269, 20, 85, 432, 253, 277, 16806, 318, 8668, 436, 310, 984, 1269, 18, 85, 19, 285, 1269, 20, 85, 403, 417, 277, 49741, 407, 1269, 19, 85, 18, 50276, 1464, 19, 260, 5256, 299, 4277, 19, 3738, 597, 403, 277, 49741, 407, 1269, 19, 85, 18, 840, 253, 19349, 1616, 729, 1617, 8599, 897, 326, 1269, 18, 85, 19, 285, 1269, 20, 85, 403, 417, 3907, 17697, 327, 1269, 19, 85, 18, 50276, 1464, 19, 260, 5256, 299, 4277, 19, 534, 2097, 326, 1269, 18, 85, 19, 310, 4217, 323, 21565, 1269, 20, 85, 495, 1677, 326, 1269, 18, 85, 19, 310, 4217, 323, 21565, 1269, 20, 85, 672, 374, 310, 36625, 1162, 66, 18, 588, 
417, 564, 281, 23579, 4795, 275, 247, 28078, 259, 1012, 534, 49294, 8599, 441, 326, 1269, 18, 85, 18, 3587, 38291, 5997, 1269, 20, 85, 50276, 2520, 18303, 326, 253, 6452, 273, 10012, 374, 778, 320, 3430, 50276, 74, 2868, 436, 310, 984, 253, 4737, 273, 10012, 374, 310, 33657, 275, 3104, 8026, 327, 3239, 1668, 352, 1057, 417, 1646, 24600, 281, 5926, 1269, 42565, 18, 50276, 292, 991, 260, 5256, 299, 4277, 89, 285, 20685, 247, 4577, 1318, 273, 253, 2105, 1159, 387, 253, 1072, 673, 4496, 9257, 2451, 352, 3340, 253, 4154, 1677, 275, 3104, 8437, 20, 50276, 19016, 352, 310, 2581, 10084, 326, 253, 4477, 42126, 3748, 2712, 670, 253, 5899, 19349, 8900, 3082, 1754, 327, 17697, 14275, 2493, 275, 253, 941, 1929, 347, 7658, 3169, 3082, 824, 347, 253, 21136, 5933, 653, 4580, 265, 1162, 355, 9725, 17857, 5933, 27887, 77, 5307, 285, 269, 5297, 653, 4580, 265, 1162, 355, 9725, 824, 3082, 403, 3587, 7763, 281, 37282, 293, 19416, 19349, 2493, 407, 2007, 7296, 253, 7658, 326, 2538, 5897, 595, 956, 253, 5997, 50275, 48499, 4496, 1056, 352, 2590, 326, 253, 4081, 1332, 13698, 281, 6642, 46449, 249, 10722, 984, 273, 253, 15895, 275, 2426, 273, 9077, 323, 4227, 604, 1269, 42565, 18, 16178, 760, 253, 11041, 273, 1269, 262, 533, 417, 697, 1599, 840, 253, 4081, 1332, 778, 417, 2736, 824, 247, 19349, 4833, 3738, 253, 7658, 3169, 3082, 476, 50276, 1279, 2380, 651, 320, 4122, 14109, 7152, 339, 9852, 253, 7714, 7429, 11454, 19349, 8900, 342, 3037, 494, 3280, 6046, 253, 4477, 6266, 247, 1332, 323, 16644, 19349, 17032, 762, 253, 10076, 273, 247, 5542, 273, 5897, 595, 18872, 3632, 4903, 342, 642, 5816, 1255, 285, 247, 1007, 2135, 3497, 273, 1677, 1979, 50276, 783, 4081, 2746, 24772, 247, 4460, 2557, 273, 253, 6349, 273, 269, 5735, 555, 275, 1016, 4778, 281, 15970, 7200, 273, 253, 2852, 985, 1375, 3037, 494, 6046, 2495, 342, 247, 12112, 5164, 11193, 11454, 2990, 50276, 20261, 253, 4758, 27096, 11935, 941, 310, 4942, 11096, 342, 1675, 281, 253, 2087, 1895, 273, 19349, 17032, 436, 310, 417, 20697, 1677, 253, 4081, 3884, 273, 2898, 281, 16644, 14720, 275, 5145, 4715, 50276, 783, 9864, 285, 1524, 941, 4679, 403, 4722, 285, 1646, 973, 3732, 50276, 66, 4468, 891, 452, 310, 326, 253, 7714, 347, 352, 9572, 310, 15471, 9366, 875, 767, 5799, 4910, 23507, 4715, 24594, 5438, 285, 19349, 17032, 323, 4828, 12690, 780, 13418, 33642, 2403, 533, 36908, 7094, 17093, 697, 2954, 281, 2057, 50276, 249, 1798, 253, 6012, 17705, 310, 10870, 281, 643, 37139, 414, 527, 32578, 4331, 12908, 327, 4778, 11250, 275, 5145, 4715, 3210, 3738, 352, 556, 16038, 275, 46449, 352, 310, 417, 14288, 6012, 432, 436, 1899, 594, 581, 1537, 4282, 849, 5795, 37139, 414, 4331, 12908, 1537, 1347, 327, 253, 1072, 5691, 50276, 3022, 3020, 352, 310, 417, 973, 5544, 752, 310, 253, 1318, 273, 253, 34003, 7688, 285, 849, 11649, 285, 6332, 275, 253, 19349, 4715, 403, 4623, 281, 253, 15450, 897, 273, 253, 34003, 1566, 50276, 249, 253, 9826, 4735, 5438, 9459, 581, 310, 7514, 3365, 342, 11138, 253, 15970, 5350, 273, 3210, 24088, 247, 14561, 1566, 1537, 320, 4944, 970, 816, 253, 19349, 4903, 326, 1537, 562, 32231, 1097, 247, 4872, 1566, 285, 247, 14561, 1566, 4944, 970, 512, 4903, 50276, 1568, 253, 990, 4736, 310, 1679, 2590, 436, 310, 34007, 275, 253, 3282, 326, 253, 789, 310, 15471, 347, 247, 5313, 275, 247, 4936, 8103, 533, 352, 651, 1646, 9865, 281, 17837, 6266, 690, 11859, 6667, 281, 30955, 436, 4809, 273, 253, 5933, 897, 1083, 50276, 3775, 2538, 15450, 50276, 187, 187, 4118, 18435, 27, 737, 3751, 46449, 310, 247, 5389, 15942, 5426, 273, 46449, 326, 11355, 19349, 14053, 281, 253, 
2469, 936, 32279, 15970, 4757, 253, 5019, 273, 8946, 650, 3751, 46449, 342, 3676, 4715, 310, 1077, 973, 17194, 347, 247, 2561, 1895, 347, 824, 253, 26272, 273, 253, 3434, 275, 436, 2929, 310, 7052, 14659, 2299, 253, 2278, 1232, 858, 32355, 1896, 32138, 275, 690, 273, 253, 2022, 3236, 1543, 273, 436, 2929, 253, 30628, 671, 4469, 7350, 326, 253, 4679, 497, 10915, 87, 19163, 1955, 281, 1077, 1355, 941, 9552, 253, 2929, 588, 5649, 432, 247, 18520, 285, 501, 538, 2230, 281, 1529, 18767, 285, 310, 417, 4704, 323, 14924, 387, 17857, 32888, 9638 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper studies a new multilabel learning mll setting ie the single positive mll spml where only one positive label is observed the authors use an ie assume negative to learn spml meanwhile they put forward pseudolabeling consistency plc regularization and labelaware global consistency lac regularization to address the false negative problem raised by an the experiments show that the proposed method an plc lac achieves the best performance the main strengths are as follows although an is not stable for spml the authors put forward plc and lac to address the false negative problem raised by an the proposed method anplclac is shown to have the best performance compared with existing spml methods the ablation study and case study are clear besides i also have several questions plc ie eq 4 has the indicator function which is not easy to train i expect to see the convergence of the proposed method the intraclass connection matrix i has a size of b2 which is an enormous matrix what exactly is the size of the matrix in your experimental datasets is it possible to train with such an enormous matrix lac ie eq 7 only involves the matrix i without the matrix bari what role does bari play the authors havent claimed any limitations docsepthe paper considers multilabel learning with single positive labels in such a scenario each image is only associated with a single positive label to recover the potential positive labels the paper first pseudolabels the unobserved labels based on the model predictions to boost the labeling performance the method considers the global consistency regularization for labelwise embeddings to learn more distinctive feature representations the experiments are conducted on multiple benchmark multilabel datasets experimental results show the superiority of the proposed method strength 1 the paper proposes an effective solution to spml problems the main contribution of this paper is to utilize the labelaware global consistency to encourage the feature representations to maintain a more compact structure the proposed method is well motivated and easy to follow 2 the experiments are performed on multiple benchmark datasets to compare with stateofart methods the proposed method achieves impressive performances and shows significant superiority over the compared methods furthermore the ablation studies and sensitivity analyses of hyperparameters are reported weakness 1 the experimental settings are less clear for example it is not introduced how to assign a single label for each image the paper is suggested to provide some details 2 the label correlations are not considered in the proposed method the label correlations have been regarded as a fundamental element for performing multilabel classification is it beneficial for improving the performance of spml 3 in experiments the proposed method shows larger superiority on coco than on other datasets authors are suggested to provide detailed discussions about this observation 4 there are some language mistakes the paper is suggested to be carefully proofread yes docsepthis paper deals with the single positive multilabel learning spml task different from vanilla multilabel learning in spml only a single eg the most obvious positive label is annotated the authors propose a labelaware global consistency loss for the task with a twostage optimization the method achieves good performance on various benchmarks the paper is easy
to read and the method is simple yet effective the authors utilize the manifold rather than the clustering assumption in spml and show the advantage of this assumption based on the empirical experiments here are some suggestions for the paper 1 figure 1 is not clear enough by comparing the singlelabel and multilabel cases the authors may first show the difference between vanilla multilabel learning and spml while there are some elements such as the embeddings and regularizations in the figure which make the figure a bit confusing 2 the labelwise embedding before eq 1 is not clear the authors use z to denote the label embedding but z does not exist in eq 1 since the label embedding is one of the most important elements in the global consistency loss the authors need to make it clear is the label embedding useful in vanilla multilabel learning eg in eq 12 in the paper 3 since the true number of positive labels is not accessible in many realworld scenarios the authors need to discuss how the proposed method can deal with this case before the experiments the authors have adequately addressed the limitations and potential negative social impact of the work docsepthis paper proposes a labelaware global consistencylac regularization to solve the single positive multilabel learning spml problem where each instance is only annotated with one of the positive labels related the proposed labelaware global consistencylac regularization leverages the manifold structure information via encouraging the global consistency among labelwise embeddings ie making intraclass embeddings closer while keeping interclass embeddings apart finally they conduct extensive experiments on multiple benchmark datasets demonstrating that the proposed method can achieve superior performance strength 1 this paper proposes a novel labelaware global consistencylac regularization to solve the single positive multilabel learning spml problem where each instance is only annotated with one of the positive labels related 2 this paper successfully applies the clustering assumption in spml and achieves good performance 3 the visualization of the problem background the proposed method and the experiment results is good which makes this work easy for the reader to understand weakness 1 the basis for the statements on page 5 lines 150-153 needs to be introduced 2 on page 3 line 104 it mentions that this paper adopts the attentionbased method to construct the labelwise embedding model but in the following introduction about the proposed method this point seems to be neglected in the proposed method which is only related to section 4.5 case study none ### Summary:
this paper studies the singlepositive multilabel learning problem to address this problem the authors adopt pseudolabels to recover the potential positive labels and adopt global consistency regularization for labelwise embeddings to learn more distinctive feature representations experimental results demonstrate the superiority of the proposal all reviewers agree to accept this paper so i recommend acceptance moreover i still have some more suggestions 1 the font size in figure 3 could be larger to make the plot clear 2 the reference format is not unified i suggest the authors revise the reference format carefully
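The recipe sketched in these reviews (assume unobserved labels are negative, flip confident ones back as pseudo-positives, and pull the label-wise embeddings of samples sharing a class toward a common centroid) is concrete enough to illustrate in code. The PyTorch-style snippet below is only a rough sketch of that idea, not the authors' implementation: the function name, the confidence threshold tau, the centroid-based consistency term and the weight lam are illustrative assumptions, and the inter-class separation part of lac is omitted.

```python
import torch
import torch.nn.functional as F

def an_plc_lac_loss(logits, label_emb, observed_pos, tau=0.6, lam=1.0):
    # logits:       (B, C) per-label classification scores
    # label_emb:    (B, C, D) label-wise embeddings from the backbone
    # observed_pos: (B, C) 0/1 mask with a single 1 per row (the one observed positive)
    obs = observed_pos.float()
    probs = torch.sigmoid(logits)

    # "assume negative": every unobserved label starts as a negative target;
    # a pseudo-labeling heuristic flips confident unobserved labels to positive
    pseudo_pos = ((probs > tau) & (obs == 0)).float()
    targets = torch.clamp(obs + pseudo_pos, max=1.0)
    cls_loss = F.binary_cross_entropy_with_logits(logits, targets)

    # label-aware consistency: pull label-wise embeddings of samples that share
    # a (pseudo-)positive class toward that class centroid
    z = F.normalize(label_emb, dim=-1)                                  # (B, C, D)
    mask = targets.unsqueeze(-1)                                        # (B, C, 1)
    centroids = (z * mask).sum(dim=0) / mask.sum(dim=0).clamp(min=1.0)  # (C, D)
    lac = (((z - centroids.unsqueeze(0)) ** 2) * mask).sum() / mask.sum().clamp(min=1.0)

    return cls_loss + lam * lac
```

In training this would be evaluated per mini-batch, with observed_pos holding the single annotated positive per image and label_emb coming from whatever label-wise (for example attention-based) embedding head the model uses.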
[input_ids, attention_mask and labels for the example above: the same token-id list twice (input_ids and labels) plus an all-ones attention mask of matching length]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper studies the mean estimation problem in the case that the users have heterogeneous sensitivities for local and central differential privacy by sharing their data the users receive payments but lose both local and central privacy a users goal is to maximize her total utility while the platforms goal is to minimize the sum of the mean squared error plus the total payments to all users with two constraints incentive compatibility and individual rationality the privacy loss is quantified by local renyi dp and a generalized version of central renyi dp in which n different epsilons are given one for each user each users privacy sensitivity is represented by a scalar ci in 01 which means that the weight of the local dp loss for user i is ci and the weight of the central dp loss is 1ci the authors analyze the minimax lower bounds for the estimation error and propose a linear estimator that is order optimal then they propose a data acquisition mechanism under the assumption that the server knows the agents privacy sensitivity distribution and has both local and central privacy guarantees and model it as a nonconvex optimization problem finally they show that there is an approximately optimal polynomial time algorithm for the linear estimators originality this paper is related to central local differential privacy and bayesian data acquisition mechanisms the authors propose a new privacy model in which each user has a privacy sensitivity that determines whether they care more about central dp loss or local dp loss also the users are allowed to have different epsilon values in both the local and central model the problem is modeled by a nonconvex optimization problem and approximately optimal polynomial time solutions are discussed based on the introduction and related work the setting and solutions are novel i do wonder if there are previous publications about users having different sensitivities for local and central dp or different users having different central dp epsilons quality the setting and notations are mostly ok some clarification questions page 3 line 138 is each xi a scalar or could it be a vector page 3 line 141 is zi defined page 3 line 142 minimize the estimates error what kind of estimate do you consider in this paper could it be arbitrary from page 4 line 173 it seems you only consider estimating a scalar from user data page 4 line 184 is $\varepsilon_1^l$ defined before you use it page 6 lines 241 and 243 why is $\varepsilon_i^l: \mathcal{X} \rightarrow \hat{\mathcal{X}}$ but $\varepsilon_i^c: \mathbb{R}^n \rightarrow \mathbb{R}$ shouldnt both functions have the same input and output value range clarity the writing and organization is mostly clear some comments page 1 line 9 the platform does not know the privacy sensitivity of users this might be misleading because you do assume that the platform knows the distribution of users sensitivities as stated in line 16 i suggest making the assumption clear in line 9 page 2 line 56 be the more the second one -> be more than the second one page 2 line 66 the platforms problem is to i suggest changing problem to goal or target the same comment for other occurrences of this phrase page 3 line 97 in this literature -> in the literature page 8 lines 295-296 this problem is still a functional optimization in terms -> this is still a functional optimization problem in terms page 9 line 362 a pointwise optimization problem hose hose -> whose significance while the setting this paper proposes
is new and kind of interesting i wonder how practical it is to model privacy in a way that each user only has a privacy sensitivity in 0 1 to determine the weight between central and local dp and the platform gets to determine every users epsilon value in both central and local dp the idea is that a user is fine with arbitrarily large privacy loss as long as they are paid according to some payment rule which is hard to justify if every user can have an upper bound epsilon then the setting would be more meaningful i dont think the setting proposed in this paper is practical see my comments in strengths and weaknesses significance docsepthis paper addresses an important problem in the field that is to eliminate local and central differential privacy dp concerns with one unified framework it also considers heterogeneous differential privacy where different users have different privacy needs the approach is relatively simple compared to stateoftheart solutions they use renyi dp and find a minimax lower bound first then they establish a pointwise optimization problem to characterize the optimal data acquisition mechanism strengths this paper addresses an important problem for realworld applications and i found the simplicity of the approach very practical for those applications it is well written and theoretically sound although i am not an expert in the field i found many parts of the paper easy to follow and understand weaknesses this approach is quite applicable to realworld problems id like to see some empirical evaluation of the proposed approach to understand how useful it could be for real applications minor but ive noticed a few typos eg hose on page 9 line 362 we capital w on page 2 line 87 the authors discuss the limitations in the conclusion section they mention that it is still an open problem to study the optimality of linear estimators i dont have any other concern docsepthis paper considers a heterogeneous population of users with respect to their privacy preferences they consider the change neighbourhood for dp which is natural for local dp it first shows a lower bound on the linear estimation for the gaussian mechanism and a matching upper bound linear estimator for individuals with epsiloni preferences in the main paper however they consider relative privacy sensitivity to local and central mechanisms instead they then cast the case of mse in parameter estimation with incentive compatibility and individual rationality constraints in some sense the result that its optimal to only add local noise is not extremely surprising given the results about anonymity and local dp it is a well written and interesting combination of a number of areas that have not been studied very much so far even though the design of optimal mechanisms for privacy is important under the assumption of a monotonically increasing virtual cost its ok if you dont explain it but it is not actually fully defined what are fi fi yes docsepthe authors study the design of optimal bayesian data acquisition mechanisms for a platform interested in estimating the mean of a distribution by collecting data from privacyconscious users the authors assume users have heterogeneous sensitivities for local and central differential privacy measures and the users share their data in exchange for a payment that compensates for their privacy losses the authors establish minimax lower bounds for the estimation error and show that a linear estimator is near optimal a bayesian setting is considered to turn the design of an optimal data
acquisition mechanism into a nonconvex optimization problem the manuscript is well organized and includes enough information needed to support the claims it makes the proposed method is novel in the sense that the authors consider the heterogeneous sensitivities of users and use a bayesian setting to cast the mechanism design problem as an optimization problem the authors have adequately addressed the limitations ### Summary:
the reviewers all agreed that the problem setting is wellmotivated the proposed solution is novel and that the paper is easy to understand and to follow though the authors response addressed most of the reviewers concerns during the acreviewer discussion phase some reviewers still noted that the method can add too much noise to data vrjn the setting where the server assigns different privacy levels epsilon in exchange for payments may be unrealistic buzh prior knowledge of the distribution users sensitivities may not be feasible in practice bcdv despite these issues remaining the reviewers and i agree that the merits of the paper outweigh the relatively minor flaws above and that this paper would make a valuable addition to the conference i encourage the authors to review the final version of their manuscript prior to publication
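To make the privacy setup in these reviews concrete, the small numpy sketch below mocks up the two ingredients they keep returning to: each user adds Gaussian noise locally, calibrated to their own epsilon, and the platform combines the noisy reports with a linear, inverse-variance-weighted estimator. Everything here is a stand-in assumption rather than the paper's mechanism: the (eps, delta) Gaussian calibration replaces the Renyi-DP accounting, the uniform data model and prior_var are made up, and the incentive/payment side is not modeled at all.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_gaussian_release(x, eps, delta=1e-5, sensitivity=1.0):
    # each user perturbs their own bounded value before sharing it; the
    # eps -> sigma calibration is the textbook (eps, delta) Gaussian bound
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return x + rng.normal(0.0, sigma), sigma

def weighted_mean_estimate(data, eps_per_user, prior_var=1.0 / 12.0):
    # linear estimator: noisier (smaller-eps) reports get smaller weights;
    # inverse-variance weighting stands in for the near-optimal linear
    # estimator discussed above, with prior_var the assumed data variance
    released, sigmas = zip(*(local_gaussian_release(x, e)
                             for x, e in zip(data, eps_per_user)))
    w = 1.0 / (np.asarray(sigmas) ** 2 + prior_var)
    w = w / w.sum()
    return float(np.dot(w, np.asarray(released)))

data = rng.uniform(0.0, 1.0, size=200)   # bounded records, sensitivity 1
eps = rng.uniform(0.2, 2.0, size=200)    # heterogeneous per-user privacy levels
print(weighted_mean_estimate(data, eps))
```

The only point of the toy is that users assigned a smaller epsilon contribute noisier reports and therefore receive smaller weights, which is the intuition behind the near-optimal linear estimator the reviews describe.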
[input_ids and attention_mask for the example above: a token-id list plus an all-ones attention mask (truncated)]
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 2175, 253, 1599, 13418, 1895, 275, 253, 1083, 326, 253, 4212, 452, 22766, 21750, 16762, 323, 1980, 285, 4275, 8967, 11068, 407, 9628, 616, 941, 253, 4212, 4763, 10762, 533, 7168, 1097, 1980, 285, 4275, 11068, 247, 4212, 4736, 310, 281, 22950, 617, 2264, 11839, 1223, 253, 13498, 4736, 310, 281, 15338, 253, 2020, 273, 253, 1599, 30044, 2228, 5043, 253, 2264, 10762, 281, 512, 4212, 342, 767, 10806, 25275, 22862, 285, 2060, 8870, 414, 253, 11068, 2957, 310, 18755, 407, 1980, 3816, 28212, 33234, 285, 247, 14923, 2715, 273, 4275, 3816, 28212, 33234, 275, 534, 295, 1027, 299, 793, 300, 790, 403, 1677, 581, 323, 1016, 2608, 1016, 4212, 11068, 7340, 310, 6607, 407, 247, 13434, 16399, 275, 14805, 534, 2097, 326, 253, 2801, 273, 1980, 33234, 2957, 323, 2608, 891, 310, 16399, 285, 253, 2801, 273, 4275, 33234, 2957, 310, 337, 5297, 253, 4477, 12106, 253, 7221, 991, 2406, 14493, 323, 253, 13418, 2228, 285, 12661, 247, 4872, 29107, 326, 310, 1340, 8654, 840, 597, 12661, 247, 941, 11931, 5122, 342, 253, 9376, 326, 253, 4771, 6057, 253, 6083, 11068, 7340, 3268, 285, 556, 1097, 1980, 285, 4275, 11068, 23632, 285, 1566, 352, 347, 247, 1327, 44181, 13757, 1895, 4720, 597, 921, 326, 627, 310, 271, 5512, 8654, 14189, 673, 5933, 323, 253, 4872, 48489, 50275, 19164, 414, 436, 2929, 310, 2905, 281, 4275, 50276, 6790, 8967, 11068, 285, 17699, 16561, 941, 11931, 6297, 253, 4477, 12661, 247, 747, 11068, 1566, 326, 1016, 2608, 556, 247, 11068, 7340, 534, 14802, 1880, 597, 1557, 625, 670, 4275, 33234, 2957, 390, 1980, 33234, 2957, 671, 253, 4212, 403, 4136, 281, 452, 1027, 299, 4277, 2193, 275, 1097, 253, 1980, 285, 4275, 1566, 253, 1895, 310, 23115, 407, 247, 1327, 44181, 13757, 1895, 285, 16851, 8654, 14189, 673, 5482, 403, 5469, 1754, 327, 253, 10199, 285, 2905, 789, 253, 4758, 285, 5482, 403, 4460, 891, 513, 4282, 604, 627, 403, 2045, 16516, 670, 4212, 1907, 1027, 21750, 16762, 323, 1980, 285, 4275, 33234, 390, 1027, 4212, 1907, 1027, 4275, 33234, 299, 793, 300, 790, 50275, 15177, 253, 4758, 50276, 42788, 403, 6571, 8718, 690, 37699, 3533, 50274, 6377, 495, 1386, 15410, 310, 1016, 1269, 74, 247, 13434, 390, 812, 352, 320, 247, 4972, 50274, 6377, 495, 1386, 21886, 310, 1182, 74, 2931, 50274, 6377, 495, 1386, 21669, 15338, 253, 8197, 2228, 752, 2238, 273, 6642, 513, 368, 1908, 275, 436, 2929, 812, 352, 320, 10341, 432, 3239, 577, 1386, 24687, 3133, 368, 760, 1908, 26230, 247, 13434, 432, 2608, 941, 50274, 6377, 577, 1386, 25921, 310, 362, 609, 4277, 18, 77, 2931, 1078, 368, 897, 352, 50274, 6377, 721, 1386, 29754, 285, 30188, 2139, 310, 362, 609, 4277, 300, 14168, 1179, 89, 987, 2501, 7856, 1588, 89, 533, 362, 609, 4277, 280, 14168, 67, 1288, 79, 987, 2501, 14168, 67, 1288, 943, 2649, 1097, 3470, 452, 253, 1072, 3280, 285, 3453, 1318, 2491, 50275, 498, 15752, 253, 4028, 285, 6003, 310, 6571, 2590, 690, 5701, 50274, 6377, 337, 1386, 898, 253, 5147, 1057, 417, 871, 253, 11068, 7340, 273, 4212, 436, 1537, 320, 24363, 984, 368, 513, 5467, 326, 253, 5147, 6057, 253, 3268, 273, 4212, 21750, 16762, 347, 4767, 275, 1386, 1668, 891, 1804, 2403, 253, 9376, 2590, 275, 1386, 898, 50274, 6377, 374, 1386, 8026, 320, 253, 625, 253, 1273, 581, 50276, 1257, 625, 685, 253, 1273, 581, 50274, 6377, 374, 1386, 9523, 253, 13498, 1895, 310, 281, 891, 1804, 6890, 1895, 281, 4736, 390, 2303, 253, 1072, 4385, 323, 643, 37102, 273, 436, 12616, 50274, 6377, 495, 1386, 10694, 275, 436, 6239, 
50276, 249, 253, 6239, 50274, 6377, 854, 1386, 28195, 23402, 436, 1895, 310, 1335, 247, 5164, 13757, 275, 2426, 50275, 2520, 310, 1335, 247, 5164, 13757, 1895, 275, 2426, 50273, 6377, 898, 1386, 33933, 247, 1127, 3020, 13757, 1895, 36863, 36863, 50276, 39374, 50275, 9188, 40348, 1223, 253, 4758, 436, 2929, 29328, 310, 747, 285, 2238, 273, 4722, 891, 4282, 849, 8542, 352, 310, 281, 1566, 11068, 275, 247, 1039, 326, 1016, 2608, 760, 556, 247, 11068, 7340, 275, 470, 337, 281, 3653, 253, 2801, 875, 4275, 285, 1980, 33234, 285, 253, 5147, 1694, 281, 3653, 1046, 4212, 299, 4277, 1318, 275, 1097, 4275, 285, 1980, 33234, 253, 2934, 310, 326, 247, 2608, 310, 4030, 342, 29607, 1781, 11068, 2957, 347, 1048, 347, 597, 403, 5087, 2556, 281, 690, 7830, 4086, 534, 310, 1892, 281, 15249, 604, 1046, 2608, 476, 452, 271, 5170, 3033, 299, 4277, 840, 253, 4758, 651, 320, 625, 14282, 891, 13414, 1158, 253, 4758, 4081, 275, 436, 2929, 310, 8542, 923, 619, 5701, 275, 20544, 285, 32213, 50276, 9188, 40348, 5474, 33032, 2520, 2929, 12453, 271, 1774, 1895, 275, 253, 1673, 326, 310, 281, 13469, 1980, 285, 4275, 8967, 11068, 33234, 7350, 342, 581, 27998, 7792, 352, 671, 19401, 253, 22766, 8967, 11068, 835, 1027, 4212, 452, 1027, 11068, 3198, 253, 2746, 310, 4942, 2969, 2429, 281, 1375, 23037, 14387, 5482, 597, 897, 3816, 28212, 33234, 285, 1089, 247, 7221, 991, 2406, 3033, 806, 840, 597, 5100, 247, 1127, 3020, 13757, 1895, 281, 17710, 253, 8654, 941, 11931, 5122, 20544, 436, 2929, 12453, 271, 1774, 1895, 323, 1524, 10186, 4893, 285, 891, 1119, 253, 17647, 273, 253, 2746, 1077, 8542, 323, 1110, 4893, 352, 310, 973, 3542, 285, 28055, 7835, 3738, 891, 717, 417, 271, 6485, 275, 253, 1673, 891, 1119, 1142, 4243, 273, 253, 2929, 3477, 281, 956, 285, 2096, 50276, 20881, 1255, 265, 436, 2746, 310, 3240, 7763, 281, 1524, 10186, 3237, 2654, 751, 281, 923, 690, 16774, 7103, 273, 253, 4081, 2746, 281, 2096, 849, 4217, 352, 812, 320, 323, 1524, 4893, 5884, 533, 209, 422, 8344, 247, 1643, 963, 993, 24088, 36863, 275, 3239, 898, 1386, 33933, 359, 5347, 259, 275, 50276, 6377, 374, 1386, 11422, 50275, 43355, 25339, 253, 7364, 275, 6452, 2593, 597, 3748, 326, 352, 310, 1335, 271, 1527, 1895, 281, 1263, 253, 5556, 1319, 273, 4872, 48489, 891, 13414, 452, 667, 643, 4468, 5474, 33032, 2520, 2929, 19401, 247, 22766, 3072, 273, 4212, 342, 1675, 50276, 936, 616, 11068, 17971, 597, 1908, 253, 1818, 50276, 570, 798, 75, 7390, 3639, 323, 33234, 534, 310, 3626, 323, 1980, 33234, 352, 806, 2722, 50276, 66, 2406, 3033, 327, 253, 4872, 13418, 323, 253, 305, 12064, 1405, 2291, 1204, 50276, 395, 247, 11038, 5170, 3033, 50276, 8172, 29107, 323, 4292, 342, 50276, 4259, 74, 17971, 50276, 249, 253, 2022, 2929, 2299, 597, 1908, 50276, 27038, 11068, 7340, 281, 1980, 285, 4275, 6297, 3185, 50276, 9328, 840, 5248, 253, 1083, 273, 278, 339, 275, 4764, 13418, 342, 25275, 22862, 285, 2060, 8870, 414, 10806, 275, 690, 3282, 253, 906, 326, 697, 8654, 281, 760, 823, 1980, 6046, 310, 417, 6685, 10084, 1677, 253, 1543, 670, 39185, 285, 1980, 33234, 352, 310, 247, 973, 3542, 285, 4722, 5019, 273, 247, 1180, 273, 3672, 326, 452, 417, 644, 5421, 1077, 1199, 594, 2080, 1014, 2167, 253, 2216, 273, 8654, 6297, 323, 11068, 310, 1774, 50275, 4524, 253, 9376, 273, 247, 45973, 3629, 7503, 2105, 697, 8718, 604, 368, 13414, 5513, 352, 533, 352, 310, 417, 2686, 4751, 2931, 752, 403, 12684, 12684, 50276, 9820, 5474, 339, 431, 248, 4477, 1263, 253, 2216, 273, 8654, 17699, 16561, 941, 11931, 6297, 323, 247, 5147, 6110, 275, 26230, 253, 1599, 273, 3268, 407, 17055, 941, 432, 11068, 
23739, 4212, 253, 4477, 5467, 4212, 452, 22766, 21750, 16762, 323, 1980, 285, 4275, 8967, 11068, 5593, 285, 253, 4212, 3894, 616, 941, 275, 6431, 323, 247, 7830, 326, 7037, 684, 323, 616, 11068, 11655, 253, 4477, 5100, 7221, 991, 2406, 14493, 323, 253, 13418, 2228, 285, 921, 326, 247, 4872, 29107, 310, 2822, 8654, 247, 17699, 16561, 4758, 310, 2783, 281, 1614, 253, 2216, 273, 271, 8654, 941, 11931, 5122, 715, 247, 1327, 44181, 13757, 1895, 50276, 783, 7714, 310, 973, 10932, 285, 3797, 2217, 1491, 3058, 281, 1329, 253, 3916, 352, 2789, 253, 4081, 1332, 310, 4460, 275, 253, 3282, 326, 253, 4477, 1908, 253, 22766, 21750, 16762, 273, 4212, 285, 897, 247, 17699, 16561, 4758, 281, 5248, 253, 5122, 20462, 1895, 347, 271, 13757, 1895, 253, 4477, 452, 18212, 9713, 253, 7364, 2490, 187, 4118, 18435, 27, 783, 30628, 512, 5821, 326, 253, 1895, 4758, 310, 973, 24013, 8550, 253, 4081, 2900, 310, 4460, 285, 326, 253, 2929, 310, 3477, 281, 2096, 285, 281, 956, 2167, 253, 4477, 2380, 9713, 954, 273, 253, 30628, 7350, 1309, 253, 36982, 1374, 254, 5955, 3408, 690, 30628, 1335, 4879, 326, 50275, 783, 1332, 476, 823, 1512, 1199, 6046, 281, 941, 362, 83, 33126, 50276, 783, 4758, 835, 253, 4771, 39360, 1027, 11068, 2308, 299, 4277, 275, 6431, 323, 10762, 778, 320, 46521, 1081, 20122, 50276, 40844, 3640, 273, 253, 3268, 4212, 21750, 16762, 778, 417, 320, 17887, 275, 3946, 270, 2428, 87, 50276, 3229, 3784, 841, 3374, 5780, 253, 30628, 285, 891, 5194, 326, 253, 16108, 273, 253, 2929, 32180, 798, 253, 4942, 5884, 32138, 1840, 285, 326, 436, 2929, 651, 1056, 247, 9865, 1635, 281, 253, 8059, 891, 11907, 253, 4477, 281, 2278, 253, 2457, 2715, 273, 616, 7714, 2720, 281, 9311 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: In this paper the authors propose an improvement for score-matching based generative modeling [1], resembling low-temperature sampling as in GANs or flow-based models. Similarly to [2], they propose to modify the drift function used in the sampling step of the diffusion model by including the gradient of some classifier. However, contrary to [2], the classifier considered by the authors is implicit, in the sense that it is purely defined by a conditional and an unconditional generative model. The authors show that using such a classifier allows controlling a tradeoff between IS and FID on the ImageNet dataset. The use of such an implicit classifier also provides intuition about the influence of guidance in score-based generative modeling: the model tries to reduce the unconditional likelihood while increasing the conditional likelihood.

[1] Song, Sohl-Dickstein, Kingma, Kumar, Ermon, Poole. Score-based generative modeling through stochastic differential equations.
[2] Dhariwal, Nichol. Diffusion models beat GANs on image synthesis.

Strengths: The paper is clear and overall well written. The experiments are easy to read and illustrate the announced tradeoff between FID and IS. The idea of using implicit classifiers is interesting and provides a new step toward defining guided score-based generative models using only score-based models. Similarly, I found the intuitive explanation of how guidance works to be well motivated and illustrated with the implicit classifier guidance. The experiments are interesting and clearly illustrate the tradeoff between IS and FID.

Weaknesses: In my opinion the contribution is quite incremental, in the sense that [2] already used classifiers for the guidance of score-based models. The authors justify the use of unconditional classifiers by claiming that the use of the classifiers of [2] can be interpreted as an adversarial attack, and that therefore the guided score-based generative modeling behaves like GANs. I am not fully convinced by this explanation, as I think that even though the model in [2] incorporates a guidance classifier term, it is still quite far from being close to GANs. Also, it is not clear to me why the classifier used in [2] would constitute an adversarial attack against the model. As of now, the motivation for considering an implicit classifier as presented in the introduction seems a bit superficial. I think the paper would truly benefit from an investigation of the shortcomings of score-based generative models using classifier guidance; then we could fully appreciate the benefits of using unconditional guidance diffusion models. Even though the paper is mostly experimental, I think that the theoretical part of the paper could have been more developed. In particular, it is not clear to me what the properties of the model given in the equation at the end of p. 3 are. Also, in order to be able to say that the classifier guidance model approximately samples from \tilde{p}_\theta(z_\lambda \mid c), one needs to explicitly write down what the joint forward model is and how to derive the backward model; the approximation can then be obtained similarly to equation (4) in [3].

General comments: In Table 1 I would have expected the non-guided model to have better FID than the guided model with w = 0.1; however, this does not seem to be the case. Similarly, in Table 2, while the IS score keeps increasing with the parameter w, this is not the case for FID. Is there a reason why the best FID score is reached for w = 0.3? I think it would have been interesting to show the nearest neighbors in the ImageNet dataset depending on the parameter w: how close is the model to the original dataset for large values of w? For large values of w, the score given by equation (6) is approximately scaled by w. Thinking in terms of diffusions, this amounts to scaling the drift by a parameter w. The Lipschitz constant of the drift is then also multiplied by w, and therefore, in order to obtain a stable discretization, we need to divide the step size of the Euler-Maruyama discretization by a parameter w. However, in the experiments it seems that the authors choose the same step size for every value of w; could you comment on this?

[1] Song, Sohl-Dickstein, Kingma, Kumar, Ermon, Poole. Score-based generative modeling through stochastic differential equations.
[2] Dhariwal, Nichol. Diffusion models beat GANs on image synthesis.
[3] De Bortoli, Thornton, Heng, Doucet. Diffusion Schrödinger bridge with applications to score-based generative modeling.

The main idea of the paper is interesting and the experiments are quite convincing. However, I feel that the motivation behind this improvement is quite superficial. I think that the authors should better motivate their study, especially putting an emphasis on the limitations of classifier guidance in score-based generative models. I will increase my score if the authors address my concerns.

docsep

This work proposed a method to trade off sample diversity for sample quality in diffusion models, which is termed unconditional guidance. Different from the prior work called classifier guidance (Dhariwal & Nichol 2021), which relies on a classifier for providing the guidance signal, the proposed unconditional guidance mixes the score estimates of a conditional diffusion model and an unconditional diffusion model for a tradeoff between sample quality and sample diversity. Experiments on ImageNet 64x64 and 128x128 showed that the proposed model can achieve the claimed quality-diversity tradeoff with respect to FID and IS scores.

Strengths:
1. The paper is well written and easy to read. The method is clearly stated, and I particularly like the way the authors posit some hypotheses on classifier guidance (i.e., being adversarial to the classifier) to motivate the proposed method, and the resulting discussions.
2. The idea of constructing an implicit classifier from the conditional and unconditional generative model to provide a guidance signal in diffusion models is new to me, although I have some concerns about its significance in the comments below.
3. Experiments demonstrate the effectiveness of the proposed method in trading off sample diversity for sample quality.

Weaknesses:
1. My first major concern is about the significance of the proposed method: is the unconditional guidance of more practical significance than the classifier guidance? The proposed unconditional guidance relies on a mixture of a conditional diffusion model and an unconditional diffusion model for the quality-diversity tradeoff. (1) It means we have to train a conditional diffusion model from scratch to apply the proposed method. Wouldn't it be much less flexible than training a noisy image classifier from scratch and directly applying it to the pretrained unconditional diffusion model in a plug-and-play manner? Plus, the proposed method has worse sampling speed and no better sample quality and diversity tradeoff compared to classifier guidance. (2) The authors mentioned the truncation trick in GANs as their motivating example, but the truncation trick is a post-hoc process that plugs into a pretrained GAN, while the proposed method has to train a new generative model with labels to get the tradeoff. On the contrary, the classifier guidance enjoys the same post-hoc property as the truncation trick, though it also needs labels during inference. In this sense, the classifier guidance resembles the truncation trick more than the proposed method does.
2. In experiments, to show the tradeoff between sample quality and sample diversity, I would recommend adding the precision and recall metrics rather than solely focusing on FID and IS scores, because the FID score does not only capture the sample diversity but is also affected by the sample quality; a precision-recall curve would be more convincing.
3. In experiments, only ImageNet 64x64 and 128x128 are considered. I think adding experiments on ImageNet 256x256 and ImageNet 512x512, as the classifier guidance paper did, would make the results in the paper more impressive.
4. When introducing the equation \epsilon(z_\lambda, c) = \sigma_\lambda \nabla_{z_\lambda} \log p(z_\lambda \mid c), is \sigma_\lambda assumed to be negative? It seems that a negative sign is missing before \sigma.
5. The method name "unconditional guidance" is inappropriate, because it requires training a conditional diffusion model with labels.

Although the idea is new in the context of diffusion models and the experiments support the major claim to some extent, I think (1) the proposed method is of less practical significance and less flexibility compared with the prior work on classifier guidance, and (2) the experiments can be improved to better support the claim and to make the results more impressive. Thus my initial recommendation is not accepting the paper.

docsep

The paper belongs to the class of diffusion-based generative models for generating synthetic images using class labels as guidance. It starts with a view of a recent work (Dhariwal & Nichol 2021) where a classifier model is trained jointly with the diffusion-based generative model, and the score (gradient of the log probability) of the classifier is magnified and added to the score of the generative model to increase fidelity at the expense of diversity. In this view, Dhariwal & Nichol's approach resembles adversarial attacks on GAN-based approaches, motivating the key question that the paper addresses: whether we can train a generative model without a classifier while still enjoying the ability to trade diversity for fidelity. The answer presented in the paper is to replace the explicit classifier with an implicit classifier, modeled as a generative model conditioned on the class, and the weights between the conditioned generative model and the unconditioned generative model are shared by viewing the unconditioned generative model as part of the conditional model but with an additional null class. Experiments on ImageNet show that the results compare favourably to Dhariwal & Nichol (2021) and Ho et al. (2021), and in some cases slightly outperform them.

Positive points:
- Quite an impressive background review and explanation of how Dhariwal & Nichol's approach resembles GANs.
- Although not clearly stated in the paper, I find that replacing an explicit classifier with a generative-model-based implicit classifier has the potential to improve classification performance on noisy latent spaces, because of better model capacity and more resemblance to the already accurate unconditional generative model.

Negative points:
- The motivation for removing the classifier is not entirely clear. For understanding, yes, but that is not too difficult to see from the last equation line of page 3. If the reason is to avoid adversarial attacks altogether, then this leads to another question: why do we need to avoid adversarial attacks? The mathematical formulation of GANs may be flawed for optimisation, but the idea of increasingly harder examples to learn is beneficial; this tactic is very common in imbalanced classification problems. More discussion on the motivation of the paper is needed.
- The paper introduced a clever trick to allow sharing weights between a conditioned model and an unconditioned model by introducing an additional label representing the unconditional case. This works well for class labels and allows deriving implicit classifiers directly from the model via Bayes' rule. However, there is an efficiency cost at inference: for every iteration at inference, the model has to run twice to generate the conditioned and unconditioned predictions; using an explicit classifier, as pointed out by the authors, is faster. While I think speed is a concern, what would be more interesting is a discussion, both theoretical and experimental, of positive point 2 above.
- The title "unconditioned diffusion guidance" is misleading. The presented model is clearly conditioned diffusion guidance, because you still condition on the labels; in the end you just replaced an explicit classifier with an implicit one. It would be more appropriate to call the approach "conditioned diffusion guidance without a classifier", as mentioned in a few places in the paper.
- The explanation on page 4, which claims that a (w+1)-weighted guided unconditioned model should theoretically lead to the same result as a w-weighted guided conditioned model, is somewhat not entirely true. We are roughly discussing trading an unconditioned generative model and a classifier model for a conditioned generative model; if the capacity of the generative models outweighs that of the classifier model, it would be fair to say the latter would be dominated by the former.
- The experiments are somewhat lacking. Apart from the replacement of the explicit classifier with an implicit classifier sharing weights with the generative model, there is a switch from discrete-time training in Dhariwal & Nichol (2021) to continuous-time training here. The experimental results report only the total performance; however, how much of the improvement in FID and IS in best-case scenarios really comes from the classifier replacement (i.e., the implicit classifier model having larger capacity than the explicit classifier and sharing weights with the generative model), and how much comes from the switch from discrete-time to continuous-time training?
- In section 5, the argument that the proposed unconditionally-guided sampler cannot be interpreted as a gradient-based adversarial attack on a classifier, and the explanation of how guidance works (it decreases the unconditional likelihood of the sample while increasing the conditional likelihood), somewhat contradict each other, since the latter can be viewed as a form of implicitly injecting adversarial attacks into the computation of the score (i.e., the last equation on the second-to-last line of page 4, section 3.2).

The paper has some novelties, but they are not well motivated and not well backed up by the provided discussion and evidence.

docsep

This paper proposes an unconditional guidance which discards the classifier used in previous work. Under such unconditional guidance it is also able to obtain a tradeoff between sample quality and diversity, as classifier guidance does. Even though the effectiveness of the proposed unconditional guidance is justified, I am not convinced of its advantages compared with classifier guidance. Specifically, the proposed method needs to simultaneously train an unconditional model and a conditional one, while the classifier guidance method needs to simultaneously train a conditional model and a classifier. In my opinion both of them need to train two models, so the computational cost of training is similar. What's more, for the sampling phase the proposed method is much slower, as pointed out in Sec. 5. Therefore I'm confused about the usefulness and the advantages of the proposed unconditional guidance.

As for the experiments, all the network architecture and hyperparameter settings follow the previous work (Dhariwal & Nichol 2021). Through the experiments the authors verify that the unconditional method can also achieve the goal of balancing sample quality and diversity. However, as a complete work it should also sufficiently exploit its best potential performance and give some optimal settings, so as to provide convenience for others to follow.

In summary, I give positive comments to this work, though some aspects of the method can be improved. ### Summary:
This paper modifies the conditional diffusion model guided by a classifier, as introduced by Dhariwal & Nichol (2021), by replacing the explicit classifier with an implicit classifier. This implicit classifier is derived under Bayes' rule and combined with the conditional diffusion model. The combination can be realized by mixing the score estimates of a conditional diffusion model and an unconditional diffusion model; a tradeoff between sample quality and diversity, in terms of the IS and FID scores, can be achieved by adjusting the mixing weight. The paper is clearly written and easy to follow. However, the reviewers do not consider the modification to be that significant in practice, as it still requires label guidance and also increases the computational complexity. From the AC's perspective, the practical significance could be enhanced if the authors can generalize their technique beyond assisting conditional diffusion models.
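The score-mixing rule discussed in this record can be made concrete with a short sketch. This is a minimal illustration, not the reviewed paper's reference implementation: it assumes the common formulation in which the guided noise estimate is a weighted combination of the conditional and unconditional estimates, with the guidance weight w controlling the quality-diversity tradeoff; the function and variable names are hypothetical.

```python
import numpy as np

def guided_eps(eps_cond, eps_uncond, w):
    """Mix conditional and unconditional score (noise) estimates.

    Assumes the common classifier-free-style combination
        eps_guided = (1 + w) * eps_cond - w * eps_uncond,
    where w = 0 recovers the plain conditional model and larger w trades
    sample diversity for sample quality, as discussed in the reviews above.
    """
    return (1.0 + w) * eps_cond - w * eps_uncond

# Toy usage with stand-in estimates (hypothetical shapes and values).
eps_c = np.random.randn(3, 64, 64)  # estimate from the conditional pass
eps_u = np.random.randn(3, 64, 64)  # estimate from the unconditional pass
for w in (0.0, 0.1, 0.3, 1.0):
    eps = guided_eps(eps_c, eps_u, w)
    print(f"w = {w}: mean guided estimate = {float(np.mean(eps)):.4f}")
```

In the setting described by the reviews, the two estimates come from the same shared network queried with and without the class label, which is why sampling needs two forward passes per step, as one reviewer notes.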
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: the authors show that certain complete neural network verifiers can be misled by carefully crafted neural networks that exploit roundoff errors which arise when large magnitude values overwhelm low magnitude values such a construction can be obfuscated by taking advantage of the compounding effect when there are many layers of the network this can also be used to add backdoors to existing networks albeit in a way that looks quite artificial i definitely agree with the authors that it is important to draw attention to edge cases where complete verifiers can fail given that completeness can lead to a false sense of security for that alone i think this paper merits attention even if it is a wellknown fact that numerical errors can mess up neural networks that being said i think there are a few significant drawbacks to this work 1 presentation of the paper the paper at times feels like more of a discussion than a detailed exploration of a certain attack type as an example of why this is not optimal it makes it difficult to figure out at a glance what each table is referring to also it obfuscates the experimental results of which there are quite a few in the paper i believe the paper can benefit from a more formal style with paragraph headings and subsections breaking up the text and the conclusions clearly highlighted as opposed to being spread throughout the text 2 flipping the answer from yes to no for a binary function requires a small perturbation near the decision boundaries so the fact that numerical computations can lead to wrong answers in and of itself is not surprising what would be much more interesting is the degree to which such attacks can shift a continuous function i believe that the method in this work leads to arbitrarily large differences but i think this is something that should be explicitly exploreddocsepthe paper shows that it is possible to fool exact verifiers using numerical instabilities it proposes network architectures that can exploit numerical issues in order to get certificates from verifiers based on architecture such as mipverify and that can at the same time accept adversarial examples within the certified epsilonball using a simple trigger this raises important security issues and as the authors suggest i do believe that such problems can arise in many situations the problem is first put into light on a very simple architecture and then on more complex ones and then with an added backdoor on an existing network several optimizers for the verifiers are compared and behave similarly a defence to this behavior is proposed making all network parameters slightly noisy while im convinced on the importance of the subject and i understand that it is probably mainly for illustration purposes i have some questions mainly on the backdoor concept how can it be invisible to the verifier in section 5 here i understand that only the original architecture and weights are provided to mipverify and detected in section 6 i miss a point there about the defence i wish there were more experimental results with different epsilon values to have a better intuition on the global behavior of the defence i also think there could be more details on how the verifier detects the backdoor as mentioned in previous point docsepthis paper argues that although existing complete neural network verifiers can provide some guarantees on the robustness these verifiers have overlooked potential numerical
roundoff errors in the verification and in such cases the provided guarantees may be invalid to show such a phenomenon the authors propose to construct adversarial neural networks that can cause the complete verifier to produce imprecise results in floating point arithmetic and can thereby fool the verifier they also showed it is possible to insert a backdoor into the network such that the backdoor is missed by the verifier while it can trigger some behavior desired by the attacker although this paper has also discussed a possible defense i find the corresponding section not very clearly written pros this paper raises an interesting problem in complete verifiers about potential numerical errors this can be important to ensure the robustness of complete verifiers against some potential adversarial networks or backdoors the authors demonstrated the existence of the numerical error problem via constructing adversarial networks to fool complete verifiers cons the structure of adversarial networks or inserted backdoors is not made to match some actual neural network architectures eg in figure 2 the network has a series of linear layers but has no activation between them and thus it does not look like an actual nn structure is it possible to construct adversarial networks on realistic architectures eg mlp or cnn with relu activations although defending against adversarial networks has been discussed in the paper the writing appears inconsistent and unclear see additional comments below additional comments i find sec 6 is probably not very consistently and clearly written in the beginning it is said that adversarial networks are sensitive to weight perturbations the key insight is that some of the parameters of our adversarial network are rather sensitive to noise whereas nonadversarial networks are naturally robust to a very small perturbation of their parameters and later it is said that the network with the backdoor appears to be robust to noise this looks confusing to me can you elaborate more on whether you think the adversarial network is or is not robust to small noise and the later paragraphs look difficult to understand updates after rebuttal thanks to the authors for the reply i have read the author response and understand that actually there are activations in the networks but they are just omitted from the figures i am increasing my recommendation to 6docsepthe paper presents a method to create neural networks that due to floatingpoint error lead to wrong robustness certifications on most input images by a socalled complete verifier for neural network robustness the authors show how to make their networks look a bit less suspicious and they discuss a way to detect neural networks that have been manipulated in the way they suggest to me it was obvious a priori that any complete verifier for neural network robustness that treats floatingpoint arithmetic as a perfect representation of real arithmetic is unsound however i think works like the current one are important to publish so as to practically demonstrate the limitations of the guarantees given by certain robustness certification systems and to motivate further research therefore i expect the target audience of the paper to be informed outsiders who have not so far questioned the validity of robustness certification research that did not explicitly address floatingpoint semantics in light of this the paper has several weaknesses related to presentation terminology is often used in a confusing way for example the approach that is practically
demonstrated to be unsound is called a complete verifier with the strongest guarantees wrongly implying that all other verifiers must be at least as unsound the related work is incomplete for example unsoundness due to floatingpoint error has been previously practically observed in reluplex httpsarxivorgpdf180410829pdf in this case it produced wrong adversarial examples without any special measures having been taken to fool the verifier the related work is not properly discussed in relation to floatingpoint semantics some of the cited works are sound with respect to roundoff others are not i would expect this to be the central theme of the related work section so as to properly inform the reader if and why certain approaches should be expected to be unsound with respect to roundoff the current wording that all the verifiers that work with a model of the network are potentially vulnerable is not fair to all authors of such systems some have taken great care to ensure they properly capture roundoff semantics i did not find obfuscation and defense particularly wellmotivated what is the practical scenario in which they would become necessary the paper sends a somewhat strange message it exclusively suggests to combat floatingpoint unsoundness by employing heuristics to make it harder to find actual counterexamples what about just employing verifiers with honest error bounds that explicitly take into account floatingpoint semantics it may not be possible in the nearterm to actually make correct complete verifiers but at least authors of incomplete verifiers will not have to succumb to pressure to make an unsound complete version in order to match the precision performance andor guarantees of their competitors the technical sections are written well enough to be understandable and the main technical contribution is a pattern of neurons we can insert into a neural network in order to make it behave in an arbitrary way that is invisible to the considered verifier this is interesting and disproves any claim of completeness but scenarios where this would be a way to attack a system seem a bit contrived ideally there would be an approach that can exploit roundoff within a nonmanipulated verified neural network to arbitrarily change the classification of a given input without changing the network the paper might benefit from a discussion of this possibility and an explanation of why it was not attempted the new section 24 is appreciated though it seems the paper still does not say that incomplete methods can deal with roundoff error by sound overapproximation ### Summary:
the authors demonstrate that complete neural network verification methods that use limited precision arithmetic can fail to detect the possibility of attacks that exploit numerical roundoff errors they develop techniques to insert a backdoor into networks enabling such exploitation that remains undetected by neural network verifiers and a simple defence against this particular backdoor insertion the paper demonstrates an important and often ignored shortcoming of neural network verification methods getting around which remains a significant challenge particularly in adversarial situations this is a significant risk and needs to be studied carefully in further work all reviewers were in agreement on acceptance and concerns raised were adequately addressed in the rebuttal phase hence i recommend acceptance however a few clarifications raised by the official reviewers and public comments should be addressed in the final revision 1 acknowledging that incomplete verification methods that rely on sound overapproximation do not suffer from this shortcoming 2 concerns around reproducibility of mipverify related experiments brought up in public comments
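The reviews above turn on a single numerical mechanism: in finite-precision arithmetic a large-magnitude value can completely absorb a small one, so a verifier that reasons in exact arithmetic and a network evaluated in float32 can disagree. The snippet below is only a minimal, generic illustration of that absorption effect; it is not the adversarial construction from the reviewed paper, and the magnitudes chosen are arbitrary.

```python
# Minimal sketch of the floating-point absorption effect the reviews describe.
# This is NOT the paper's adversarial construction; the constants are arbitrary.
import numpy as np

big = np.float32(1e8)     # large-magnitude value
small = np.float32(1.0)   # small-magnitude value

real_arithmetic = 1.0                  # (big + small) - big, computed exactly
float32_result = (big + small) - big   # round-off: `small` is absorbed by `big`

print(float32_result)                            # 0.0
print(real_arithmetic == float(float32_result))  # False: exact and float32 disagree
```

Stacked over many layers, small discrepancies of this kind can compound, which is the effect the first review refers to as the compounding effect.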
input_ids: [token-id encoding of the Input text above; full integer list omitted]
attention_mask: [all ones; full list omitted]
labels: [identical to the input_ids list; full list omitted]
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: summary the paper proposes a method for imitation learning for goaldirected tasks that uses a learned proximity function for computing rewards an ensemble of proximity functions is trained in a supervised way to predict the time step rescaled to the range 01 for the experts states the ensemble is improved online based on the additional objective of assigning zero proximity to the states encountered by the agent the agent is improved using ppo where the gain in proximity serves as reward function with an additional cost based on the uncertainty about the proximity estimated by the standard deviation of the ensemble strong points predicting time from states is an interesting idea and assigning higher reward to later states is sensible for goal directed tasks the paper is wellwritten and sufficiently clear concerns soundness there is barely any theoretical justification for the approach furthermore several hacks hyperparameters have been added to achieve the reported results uncertainty penalty bonus reward proximity discounting reward scaling relevance the approach is limited to goaldirected tasks and thus much less applicable than comparable methods evaluation i think that the evaluations are not fair for several reasons 1 the competitors are not intended for learning under different initial state distributions it is well known that behavioral cloning often performs very badly for outofdistribution data adversarial methods like gail and gails are based on distribution matching which is in general not possible if the initial state is significantly different from the demonstrations especially gails can suffer here since it would move to the initial state when starting at the goal position matching state transitions as gailfo would be a bit more reasonable here methods that can guide the agent towards the demonstrations such as sqil reddy et al 2019 might be even more suitable 2 the paper mentions that gail makes use of additional information by taking into account the actions but completely ignores the fact that the proposed method makes use of timelabels which are much more relevant for tasks that are essentially defined by the state at the last time step it would be easy to provide this information also to the adversarial methods by weighting the discriminator samples during training dependent on the time step in the extreme case by only using the samples from the final time steps another naive baseline would be to use a nonparametric reward function by placing radial basis functions at every expert sample again weighted dependent on the time step instead the evaluation does not seem to make any use of the strong assumption of goaldirected tasks for any of the competing methods 3 it seems that much more hyperparameter tuning was involved for obtaining the results for the proposed method than for competing methods even for hyperparameters that are applicable to other methods such as learning rates and reward scaling the competing methods share the same parameters during all experiments whereas the proposed method has different hyperparameters depending on the environment additional feedback clarity could be improved at some points section 32 introduces delta as a discounting but doesnt mention that it is typically set to 1t which essentially scales the predicted time step to the range 01 without this information it is hard to make sense of eq 1 because for a discount factor close to
1 which is typical for the discount factor of the mdp the proximity function would actually be negative for most time steps and by setting the proximity of agent states to 0 one would actually assign them very high goal proximity questions 1 why provide an additional reward for the last time step is this really necessary to obtain good results 2 i dont see how the optimal proximity function for eq4 converges to frac12 delta fractt2 this is not really relevant to the algorithm but still im wondering whether this claim is correct can you sketch a proof 3 can you specify the initial distributions for the test and train experiments more precisely for example for the navigation task you mention that 50 of the possible initial states and goals were used for collecting demonstrations but i can neither find the set of possible initial states nor how the 50 were chosen uniformly or selected assessment the proposed heuristic approach seems reasonable for the considered setting however overall the contribution is too small strong theoretical results or a thorough empirical evaluation with suitable baselines would both be fine for me however the current submission lacks quite severely on both sides and is therefore in my opinion not suitable for publication references reddy s dragan a d levine s sqil imitation learning via reinforcement learning with sparse rewards iclr 2019docsepto accelerate and improve imitation learning for goaldriven tasks the authors introduce a goal proximity function that is learned from the observation of expert demonstrations and online agent experience the inferred goal proximity is used as an additional reward signal the authors showed that these heuristics efficiently improve the performance of imitation learning the method is simple and looks effective as shown in the experiments however from the theoretical viewpoint this proposal looks like a heuristic method it is better to clarify the theoretical foundation the relationship with gail is mentioned several times however the explicit comparison between the proposed method and gail is not given for example first we set the target proximity of states in agent trajectories to 0 similar to adversarial imitation learning methods ho ermon 2016 and train the proximity function with both expert demonstrations and agent experience by minimizing the following loss describing the comparison in an appendix may help readers to understand the key idea to my understanding the paper focuses on lfo however the relationship between lfo and goal proximity is not clear the goal proximity can be used for lfd as well if we consider the goal proximity function as a goalrelated reward function the method is regarded as the integration of imitation learning eg gail and goaldriven reinforcement learning from this view this work looks related to the following paper kinose akira and tadahiro taniguchi integration of imitation learning using gail and reinforcement learning using taskachievement rewards via probabilistic graphical model advanced robotics 3416 2020 10551067docsepsummary the authors propose a new method for imitation learning from observation that attempts to estimate and leverage a notion of goal proximity in order to help the learning process the authors provide a framework for computing this estimate and a technique for using that estimate along with a measure of uncertainty to perform imitation learning from observation experimental results for several domains are presented in which the proposed technique achieves better performance than the
comparison methods strengths s1 the paper seeks to solve an interesting and relevant problem in ifo s2 the proposed technique estimating and using a notion of task completion proximity is as far as im aware a novel take on the ifo problem that seems to have the potential to advance the state of the art weaknesses w1 fundamental experimental results are missing the paper proposes a technique with two major components a an estimated proximity function and b a method to exploit uncertainty information in that estimate however its not clear from the results whether it is the unique combination of these components that leads to the good results or just one of them especially given the results in figure 5 which shows how critical b is one way to get at that would be to apply b to some adversarial imitation learning techniques eg gaifo and see how they perform without something like this one cannot tell if it is the proximity function the use of uncertainty information or the combination that truly leads to improvement this is a fundamental question that must be addressed w2 as written some of the details of the proposed method are not clear to the point where it would likely be difficult to reproduce for example a there appears to be an unstated assumption that all demonstration trajectories terminate at a known time t which is not typically true meanwhile it seems like such trajectories would necessitate different choices of delta but the authors only discuss how to set delta in general according to some parameter h which b it seems as though the training objective for fphi articulated in equation 1 is highly dependent on delta but it doesnt seem as though the choice of delta is discussed recommendation statement the proposed method describes a novel approach to a good problem and seems to hold promise however i feel that important experimental results need to be added before the paper should be published questions for authors q1 how would a standard imitation from observation algorithm perform if endowed with the same uncertainty information as the proposed method q2 in figure 4d why are cells in the nw quadrant seemingly just as proximal as cells in the swne it seems as though they should be less proximal minor comments mc1 the two legends present on figure 5 are confusing while looking at a and its unclear where some of the curves in the legend at the top are in b and c the figure should be revised to be more cleardocsep paper and review summary this paper introduces a goal proximity approach to learning rewards from observations captured during demonstrations in goaloriented tasks this metric allocates rewards based on the temporal distance to a goal relying on a model trained to predict this distance the paper also proposes an adversarial learning approach to policy optimisation with this reward which helps to improve policies in regions where demonstrations were not captured i like the idea but unfortunately this paper has missed key related work angelov et al burke et al on reward inference and policy scheduling using temporal progress metrics which have been proposed previously although i do see potential novelty and additions going beyond the work in angelov et al burke et al adversarial training ablations the incorporation of uncertainty in the metric the primary contribution is severely reduced as a result i am inclined to recommend rejecting this work unless the paper revisions can convince me that there are sufficient differences and novelty however this may require a substantial rewrite of the
introduction and conclusions and major revisions pros the paper is well written with detailed experiments and nice ablations i like the idea of combining a goal progress metric with adversarial learning the inclusion of the uncertainty metric is interesting potentially allowing for riskbased policy optimisation cons limited novelty a linear goal progress metric has already been proposed by angelov et al in the paper composing diverse policies for temporally extended taskshttpsarxivorgpdf190708199pdf this paper trains a model to predict the normalised time to goal using observed demonstrations and uses this to select subpolicies for longhorizon tasks in a modelbased setting angelov et al do not optimise policies directly using the goal metric in follow on work by burke et al in the paper learning rewards for robotic ultrasound scanning using probabilistic temporal rankinghttpsarxivorgabs200201240 this goal progress metric of angelov et al is extended to a probabilistic temporal ranking metric that allows for nonmonotonically increasing goal progress in demonstrations to be captured here the goal progress reward is used for policy optimisation value iteration for grid world experiments and online learning questions on page 3 the paper states there are alternative ways to represent and learn goal proximity such as exponentially discounted proximity and rankingbased proximity brown et al 2019 but in our experiments linearly discounted proximity consistently performed better than alternatives i am curious that linear proximity models performed better as experiments in brown et al and burke et al show that nonlinear temporal progress metrics are more effective in particular rankingbased methods allow for nonmonotonically increasing progress metrics is this comment based only on the comparison between exponentially and linearly increasing progress or was a temporal ranking approach considered if not i would recommend exploring a temporal ranking approach going forward in equation 2 i see the progress metric is not used directly instead a derived reward was used did you experiment with using the progress metric directly instead page 7 fetchpush why cant the methods learn diagonal pushing policies figure 5 why does the policy using offline only training fail completely is this a function of the way or amount of training data that was collected or something elsedocseppaper summary this paper proposes a method for imitation learning from observations based on learning a goal proximity function from expert demonstrations and using it as a dense reward for training an imitator the authors show that this method improves generalization to unseen states in comparison to several baselines pros the idea is simple wellmotivated and the paper is easy to read clear and wellwritten the experiments are well designed and the results are explained with sufficient detail cons while i find the idea interesting and the experiments welldesigned i have a few concerns about the method and would need some clarifications to evaluate it further 1 figures 5 and 9 demonstrate that a key component of the method is the adversarial training of the proximity function without it the method completely fails in comparison not using the uncertainty part or not training offline makes less of a difference this seems to indicate that the adversarial part is more important than the temporal aspect of it if we ignore the temporal aspect and replace the target for f 1 - delta(T - t) by its upper value of 1 we obtain a loss E_expert[(f - 1)^2] + E_imitator[f^2] which is
very similar to the one used in gail E_expert[log(1/(1 - d))] + E_imitator[log(1/d)] for values of d and f between 0 and 1 plotting the 4 functions is recommended given this similarity i think the paper would benefit from one more ablation of this form removing the temporal part and keeping the adversarial component 2 following from point 1 i wonder why the gail results are so different eg gaifo one concern i have is the final reward the method proposed in this paper makes the assumption that the last state of the demonstration is the goal therefore the authors use an extra reward at this time step eq 2 if i understand correctly gail in general doesnt make this assumption so i wonder if this is at least in part responsible for the differences between these methods i think the paper would benefit from clarifying this by performing an ablation that removes this final reward 3 one maybe more superficial concern is whether the name goal proximity is appropriate given that the function f is not only trained to follow time but also adversarially to be zero on imitator data this name is confusing to me in fact as shown in fig 4d the value of f doesnt end up corresponding to the time or number of actions that would be required to reach the goal in the general sense 4 regarding fig 4d im not sure why the centers of all quadrants are light shouldnt only the path to the goal used by the expert be light ### Summary:
the paper proposes a method for learning intrinsic reward from demonstrations the intrinsic reward is computed as timetoreach and generalizes to unseen states the reviewers agree that the method is novel useful and of interest to the iclr community although the authors significantly improved the manuscript during the rebuttal phase with new results and addressed many of the reviewers comments the overall novelty of the paper is still somewhat limited making it unsuitable for iclr in its current form the future version of the paper should address the comments below and go through a detailed pass for clarity additional comments that did not influence the final decision the idea of learning temporal distance to the goal is not novel 1 although the application as an intrinsic reward is the authors should connect the temporal difference to reachability theory and solving the twopoint boundary problem for systems with nonlinear dynamics as a theoretical foundation of the method 1 i am curious about the decision to use the time to reach as a reward directly instead of the delta between the states some empirical work provides evidence 2 3 that the delta yields fewer side effects in behaviors 1 httpsieeexploreieeeorgstampstampjsparnumber8772207 2 httpsarxivorgabs180310227 3 httpsarxivorgabs200306906
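As a side note on the loss comparison raised in the review above (the proximity regression targets versus the gail-style discriminator terms), the four functions the reviewer suggests plotting can be written down directly. The snippet below is only a sketch of that back-of-the-envelope comparison; it follows the reviewer's wording, not the exact losses used in the paper, and the grid of values is an arbitrary choice.

```python
# Sketch of the four functions the reviewer suggests comparing; this mirrors the
# review's wording, not the paper's exact losses.
import numpy as np

x = np.linspace(0.01, 0.99, 99)          # candidate outputs of f or d in (0, 1)

proximity_expert = (x - 1.0) ** 2        # expert-state term of the proximity-style loss
proximity_agent = x ** 2                 # agent-state term of the proximity-style loss
gail_expert = np.log(1.0 / (1.0 - x))    # expert-state term as written in the review
gail_agent = np.log(1.0 / x)             # imitator-state term as written in the review

for name, y in [("proximity_expert", proximity_expert),
                ("proximity_agent", proximity_agent),
                ("gail_expert", gail_expert),
                ("gail_agent", gail_agent)]:
    print(f"{name}: min={y.min():.3f} max={y.max():.3f}")
```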
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: This manuscript develops conformalized sketching, a method using conformal prediction to construct confidence intervals for frequency queries based on sketched data. The method works at any desired level, unlike the standard version of the count-min sketch, and also incorporates conservative updates. Theoretical results and simulations with both synthetic and real-life datasets are provided, which establish validity and showcase the appeal of the proposed method.

Update: after reading the authors' response I have increased my score to a 7 (accept).

Strengths: the paper is well written and appears to be up to date with the literature; an overview of conformal prediction is provided and the results are self-contained.

Weakness: more theoretical results would be helpful, for example results concerning the length of the confidence intervals, and also on performance when the exchangeability assumption does not hold (see the questions and the weakness comment).

Potential negative societal impact: not applicable.

docsep

The paper proposes a conformal method that, given a sketching function phi, computes a confidence interval for the frequency of a random query sampled exchangeably from the data. The paper's biggest strength is the sheer novelty of the combination of conformal prediction and data sketching; to my knowledge the combination is completely new. Unfortunately, the method has at least two significant limitations. First, it is unclear whether the type of theoretical guarantee the method can provide is of interest for working with sketched data, since the probability of coverage is with respect to the randomness of the query and not of the sketching. Second, for this guarantee to hold, the random queries must be exchangeable. The authors appear to be aware of the limitations of their method (cf. Section 5); however, I think these limitations deserve a more extensive treatment.

docsep

The authors propose a method for constructing confidence intervals for counts based on approximate counts generated by a sketching algorithm. The proposed method involves keeping track of exact counts for a sparse dictionary of items and computing conformity scores based on the disparity between the sketched count and the true count. Using these scores, a prediction region is generated for a randomly selected element of the dictionary using conformal prediction. The method is shown to be competitive on synthetic and real-data examples.

Strengths: the proposed method inherits the flexibility of conformal prediction and thus works for any sketching algorithm and for any performance measure (i.e., conformity score); the construction of the ordered pairs that transforms the sketching problem into a supervised learning problem amenable to conformal prediction is novel; the fact that it suffices to keep track of m0 << m counts in order to construct a prediction interval for the test object, thanks to the exchangeability assumption, is interesting.

Weaknesses: one of the main arguments that the authors make in favor of conformal prediction over deterministic approaches is that the deterministic approaches are too conservative, yet the proposed procedures use a deterministic upper bound; thus these prediction intervals are also conservative in the sense that the probability that the true quantity is greater than the upper bound is 0. While exchangeability is typically a mild assumption, it is a nontrivial one for sketching problems; in fact, the 2-gram real-data example does not appear to be exchangeable, since consecutive 2-grams share a word and thus the joint distribution of 2-grams is not invariant under permutation. The authors are instead treating the 2-grams as fixed and sampling i.i.d. from them, but then the interpretation of the conformal prediction guarantee is less clear, as the probability statement includes this sampling variation; in this setup the prediction interval covers a sampled frequency instead of the true one. Another typically benign feature of conformal prediction that may be problematic in this setting is that the test point is assumed to be exchangeable when combined with the training examples; this means that one can only construct prediction intervals for randomly chosen items, which is quite restrictive, particularly if the dictionary is large. The technical contributions of the paper are not substantial; this is not meant as a major criticism, since the combination of simplicity and generality is what makes conformal prediction useful in a wide variety of areas. The procedure becomes trivial if the element of interest appears in the warm-up phase, or if one is interested in a small number of elements, in which case one may just track these elements following the same reasoning used to justify the warm-up phase. While the writing is clear, I believe the exposition can be improved in places; for example, the sketching problem should first be introduced in more generality before the count-min sketch is introduced, since the paper is not solely about CMS or CMS-CU. Also, it is misleading to refer to the constructed prediction sets as ones that are "as short as possible" (line 162), since optimality of prediction sets is a different topic that is not investigated here. The authors are forthright about exchangeability being potentially restrictive for sketching problems, but do not discuss the subtle issues with interpreting the sampled items in the real-data examples. ### Summary:
The paper proposes a method based on conformal inference to obtain confidence intervals for the frequencies of queried objects in very large data sets, based on sketched data. The applicability of the method relies solely on the exchangeability assumption for the data, not on the sketching procedure nor on the data distribution, and is therefore very general, as emphasized by all reviewers. The reviewers have done a great job, and this should be and has been acknowledged by the authors. There have been some objections concerning the applicability of the main assumption (exchangeability), the meaningfulness of the experimental comparison with prior work and the interpretation of the resulting plots, and the amount of theoretical content of the paper, but the post-review discussion appears to have been very active and fruitful. It overall gives me the impression that the authors took the comments very seriously and will improve the manuscript accordingly, and that many objections could be answered by a more appropriate exposition. Given that this paper lies on the edge of the acceptance threshold, this improvement is very important, as the reviewers' concerns, which have some strong overlap, will otherwise probably be shared by the wider audience of readers. This is especially true given the statistics flavor of the paper, which does not target the main NeurIPS audience, implying that an even greater effort has to be put into the presentation. Very detailed and, I find, meaningful (from a layman's perspective) answers have been provided by the authors, and not all their content will fit in the additional page; there is thus important work of selection and rewriting ahead of the authors before publication.
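As a rough illustration of the mechanism the reviewers describe — a deterministic count-min upper bound paired with a conformal bound calibrated on exactly tracked warm-up items — here is a hedged, self-contained sketch. It is my own simplification, not the paper's implementation; the class and function names, the hashing scheme, and the calibration details are all assumptions:

```python
import numpy as np

class CountMinSketch:
    """Tiny count-min sketch: query() returns a deterministic upper bound."""
    def __init__(self, width=2048, depth=4, seed=0):
        rng = np.random.default_rng(seed)
        self.width, self.depth = width, depth
        self.salts = rng.integers(1, 2**31, size=depth)
        self.table = np.zeros((depth, width), dtype=np.int64)

    def _cells(self, item):
        return [(r, hash((int(self.salts[r]), item)) % self.width)
                for r in range(self.depth)]

    def add(self, item):
        for r, c in self._cells(item):
            self.table[r, c] += 1

    def query(self, item):
        return min(self.table[r, c] for r, c in self._cells(item))

def conformal_lower_bound(cms, tracked_counts, item, alpha=0.1):
    """Calibrate on items whose exact counts were tracked during warm-up.

    The conformity score is the sketch's overestimation error; the returned
    lower bound covers the true count of an exchangeable random query with
    probability at least 1 - alpha.
    """
    scores = np.sort([cms.query(x) - c for x, c in tracked_counts.items()])
    k = min(int(np.ceil((1 - alpha) * (len(scores) + 1))) - 1, len(scores) - 1)
    return max(cms.query(item) - scores[k], 0)
```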
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper tackles the classimbalanced problem with a semisupervised learning scenario unlike the previous approaches which require a complicated sampling strategy and multiple training pipelines the authors provide a simple and unified framework udal by connecting the ideas of progressive distribution alignment proposed for imbalanced semisup to logits adjustment proposed for imbalanced sup this approach incurs no additional training time on top of the underlying semisupervised learner significant empirical improvement on widely used benchmarks cifar10lt cifar100lt and imagenet127 demonstrates the effectiveness of the proposed method pros clarity overall the writing is clear and easy to follow important problem with a simple and effective solution considering the class imbalance scenario is an essential step for applying ssl in a more realistic scenario but has been less explored the proposed method can be used with a simple modification of the existing baseline and it shows significant empirical gain hence it has the potential to be widely used to mitigate this problem without additional burdens although the proposed method can be viewed as a simple combination of the existing methods i believe that the simplicity and effectiveness of the proposed method will be interesting to the reader in iclr cons more general imbalanced semisupervised learning scenarios although i agree that a considered scenario ie distributions of both labeled and unlabeled are same is most natural i wonder that the proposed method can be applicable to more generic scenarios ie distributions of both labeled and unlabeled are different is there any way to apply the proposed udal to such a challenging scenario effect of the strategy in kang et al 2020 in section 43 the authors present that they use a deferred resampling strategy which is introduced by kang et al 2020 spend the very last stages of training aligning to a relatively balanced class distribution can the authors give the details about this also is this necessary for empirical improvement kang et al decoupling representation and classifier for longtailed recognition in iclr 2020 although the proposed udal can be viewed as a simple combination of the existing methods progressive distribution alignment and logits adjustment i believe that the simplicity and empirical effectiveness of the proposed method will be interesting to the reader in iclr docsepthis paper addresses the topic of semisupervised learning in cases where the underlying data distribution is severely imbalanced this approach combines distribution alignment with logit adjustment resulting in an efficient method for solving the aforementioned problem while improving performance in the test setting unlike existing stateoftheart approaches such as crest the approach involves no sampling state and imbalance mitigation is achieved just by modifying the loss functions of the model experiments are conducted over three benchmark vision datasets of varying complexity longtailed versions of cifar10 cifar100 and imagenet127 experimental results are competitive with or exceed other methods on the tested datasets with 5x training speedup compared to crest strengths this paper addresses the topic of semisupervised learning in cases where the underlying data distribution is severely imbalanced this is an underexplored problem that is relevant to the aiml community and has practical realworld 
impact in contrast to other papers addressing this problem the proposed approach relies only on diistribution alignment without samplingbased strategies this is done by moving the distribution into the crossentropy loss computations of recent semisupervised learning algorithms fixmatch and mixmatch this work connects ideas of progressive distribution alignment with logit adjustment from the fully supervised imbalanced learning setting by considering distribution alignment alone training time is reduced 5x reduction from crest the method achieves significantly better performance characteristics as more labeled data becomes available and the approach scales to larger datasets outperforming the best existing method on imagenet127 in terms of accuracy the proposed method is wellframed in the existing literature section 2 provides a nice overview of the prerequisities needed to understand the method eqs 8 and 9 are interesting insights the approach is simple requiring only a few additional lines of code to implement with no alteration to the training scheme and no increased training time the approach only has two hyperparameters which need to be tuned experiments are conducted over three benchmark vision datasets of varying complexity longtailed versions of cifar10 cifar100 and imagenet127 experimental setup seems reasonable and follows a similar procedure to crest experimental results are competitive or exceeds other methods on the tested datasets ablations are run to validate each component of the proposed approach weaknesses imbalanced semisupervised learning while an important problem is not a new problem somewhat limiting novelty the proposed approach is a combination of two existing approaches somewhat limiting novelty experiments are conducted on only a single network architecture it would be useful to show it works on a wide range of neural network architectures it would be useful to see if the results are statistically significant questions what happens if the assumption that the unlabeled set of examples is imbalanced in a different way than the training examples please define strong and weakly augmented versions of the unlabeled example are there any significant drawbacks to this approach compared to others what happens in the balanced setting does it fall back to a standard semisupervised learning approach this paper addresses the topic of semisupervised learning in cases where the underlying data distribution is severely imbalanced this is an underexplored problem that is relevant to the aiml community and has practical realworld impact in contrast to other papers addressing this problem the proposed approach relies only on diistribution alignment without samplingbased strategies this is done by moving the distribution into the crossentropy loss computations of recent semisupervised learning algorithms fixmatch and mixmatch the paper is generally wellstructured and clearly written the experimental set up is sound and experimental results show the promise of the model the approach is wellgrounded in the existing literature and its motivation is clear as it stands i do not see any major flaws with the approach docsepthis paper focuses on classimbalanced semisupervised learning and proposes a new method that combines distribution alignment and logit adjustment techniques experiments show that the proposal can achieve performance improvement this paper focuses on classimbalanced semisupervised learning and proposes a new method that combines distribution alignment and logit adjustment 
techniques the proposal is simple however the proposed method relies on the assumption that the imbalance distribution between labeled and unlabeled datasets is the same this is a very strong prior knowledge and in reality we cannot obtain this knowledge thus the application of the proposal is limited moreover the new proposal is mainly based on existing techniques although combining these techniques can achieve a performance improvement the novelty and contribution are limited this paper proposes a new method for classimbalanced semisupervised learning however in my view the assumption of the proposal is too strong to satisfy and the novelty of the proposal is limited docsepthis paper studies classimbalanced semisupervised learning to handle this problem a unified approach is proposed by combining distribution alignment da and logits adjustment la in particular this paper proposes to apply da and la to both supervised and unsupervised losses which is new this method shows significant improvement over baselines on three datasets strengths 1 this paper studies an important problem which is underexplored in the literature 2 the proposed method is simple yet effective which shows substantial performance gains on multiple datasets 3 this paper is well written and easy to understand 4 the proposed method is reasonable in that it aligns the distribution of both supervised and unsupervised losses weaknesses 1 the novelty is limited as mentioned the proposed method is pretty much based on existing works ie da and la which are used and demonstrated to be helpful in classimbalanced tasks importantly the improvements of the methodologies are limited for example it is very natural to extend la to the unsupervised loss by using pseudolabels in this regard this paper does not examine the quality of the generated pseudolabels does the distribution of pseudolabels make sense for the da in eq 10 it immediately follows previous work with a minor difference by introducing an additional parameter k 2 more baselines should be compared in the current version this paper mainly compares its proposal with crest i am aware that crest is a strong baseline but it would be more convincing if more baselines could be compared such as 123 3 hyperparameters from figure 3 it is observed that the performance of the model is quite sensitive to the choice of alphamin 4 in table 4 the results for supervised models are not informative to better quantify the gap la or other techniques that handle class imbalance should also be applied 1 distributionaware semanticsoriented pseudolabel for imbalanced semisupervised learning 2 abc auxiliary balanced classifier for classimbalanced semisupervised learning neurips 2021 3 unsupervised semantic aggregation and deformable template matching for semisupervised learning neurips 2020 my main concern is that the novelty is limited which makes this paper like an incremental work therefore i recommend rejection ### Summary:
thanks for your submission to iclr reviews were fairly mixed on this paper with two reviewers advocating for accepting the paper and two advocating for rejecting the paper there were some concerns raised by the reviewers most notably novelty and some issues with the experiments after rebuttal the negative reviewers maintained their scores and the positive reviewers were somewhat less enthusiastic in the end the paper is quite borderline and could really go either way but it seems that the paper could use another round of reviewing particularly to make sure the issues raised by the reviewers are adequately addressed please do keep in mind their comments when preparing a future version of the manuscript
input_ids: [tokenized form of the Input and Output text above; long integer array omitted for readability]
attention_mask: [array of all ones, same length as input_ids; omitted]
labels: [identical copy of the input_ids array; omitted]
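The three arrays summarized above are the standard pieces needed to fine-tune a causal language model on these review-to-summary pairs: the prompt (Input) and target (Output) are concatenated, tokenized once, the attention mask is all ones because nothing is padded, and the labels simply mirror the input ids. A minimal sketch of how such a row could be produced is below; the tokenizer checkpoint, the helper name, and the 2048-token cap are assumptions for illustration, since the dataset itself does not say which tokenizer or maximum length was used.

```python
# Hedged sketch: build input_ids / attention_mask / labels for one dataset row.
# The checkpoint below is an assumption; swap in whichever tokenizer produced these ids.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # assumed

def build_row(input_text: str, output_text: str, max_len: int = 2048) -> dict:
    # Prompt (review) and target (summary) are concatenated into one sequence.
    enc = tokenizer(input_text + output_text, truncation=True, max_length=max_len)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all ones when nothing is padded
        "labels": list(enc["input_ids"]),         # causal LM: labels mirror input_ids
    }
```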
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: this paper introduces a novel learnable mpc approach by enhancing the traditional mpc cost with learnable components these learnable components use transformerbased embeddings of realworld context the overall model is learned in an imitation learning fashion by leveraging expert demonstrations various optimizations are considered to enable the model to run in realtime through experiments the model is less likely to be stuck in a local optimum and is more social when compared to the traditional mpc the model is also less likely to run into obstacles when compared to pure expert demonstration models this paper is very clearly written the authors have very cleanly formulated their approach and provide good reasoning describing every single step i also concur that learnablempc combines the best of both worlds with the learning part being able to encode complex environmental and social context and the mpc part providing safety guarantees to some extent the experiments are also carefully designed around 4 example scenarios and support the authors claims i still have a few questions about the experiments for learning highly constrained maneuvers are the human expert demonstrations conducted in the same cluttered environment as in the evaluation how well does performermpc perform in unseen cluttered environments when compared to rmpc or how does rmpc perform because i saw performermpc generalizes in one instance shown in the video for blind corner and pedestrian obstruction scenarios i do not see any statistical analysis are the comparisons statistically significant among the social score ratings of rmpc ep and performermpc how many subjects are there are these the same as the pilot studies mentioned in the supplementary materials i am also curious about how this method generalizes to more complex pedestrian scenarios such as navigating through a crowd how many demonstrations are needed to be successful in this crowd behaviors are highly variable and unconstrained docsepthe paper proposes an approach that allows incorporation of complex rules such as social norms obstacle avoidance in cluttered scenarios into the behavior generated by standard mpc policies the main idea of the approach is based around transformergenerated cost functions trained through imitation learning behavior cloning in this instance the authors demonstrate the effectiveness of their approach on a real robotics platform in multiple scenarios including navigation in cluttered scenes and sociallyacceptable navigation in humanoccupied environments praise the work tackles an important problem of incorporating complex behaviors while retaining advantages of the existing control approaches optimality interpretability the work conducts interesting experiments on a real robotics platform and demonstrates some level of generalization on novel albeit similar cases concerns literature overview is not complete authors cite a lot of papers for transformers in unrelatedweaklyrelated fields which is definitely excessive but the section on inverserl and cost learning is tiny even the relevant papers cited by the authors are just mentioned there is no detailed comparison ie how the current work advances already existing methods it makes it harder to appreciate what has been done in the paper just to name a few other examples from the existing relevant body of literature that is worth including comparing to sermanet et al timecontrastive
networks selfsupervised learning from video hsu et al unsupervised learning via metalearning finn et al guided cost learning deep inverse optimal control via policy optimization bechtle et al metalearning via learned loss each task is learned separately hence it brings a question of how to stitch them later although the authors explicitly mention this limitation it renders results quite a bit less impressive also the experiments are conducted in somewhat similar environments and do not exhibit strong distributional shifts under proposed inputs docsepthis paper presents a novel way of augmenting the cost function of a model predictive controller by allowing the cost function to be a combination of a fixed one and a learned one the learned part of the cost function takes the most general form of nonlinear least squares and learns the matrix coefficients directly using the embeddings of a vision transformer architecture they use performers instead of transformers to speed up the training and forward pass using bilevel optimization the gradients are propagated through the mpc optimizer the main strengths and learnings from this paper are the paper is very well written and very selfexplanatory the figures are very well chosen and the theoretical part is rigorous and sound its the first approach to my knowledge that uses vision transformers in combination with an mpc in the loop to learn the cost function of said mpc it can be an inspiration for future researchers and it has the potential of being in my opinion a very good research direction it uses bilevel optimization to propagate the gradients through an optimizer i believe that using this tool is a very elegant way of getting the best of both worlds learningbased and modelbased to work together in what they respectively excel at and weaknesses while the approach renders very good results and its elegant it is compared with rather naive baselines furthermore the existence of the mpc in the loop is not ablated in any way since in all approaches there is a tracking mpc problem that is solved at the end docsepthis paper presents a novel learningbased mpc approach where the cost function is learned using the expressiveness of deep neural networks such as transformers these extract the cost directly from context information and are trained using imitation learning while the mpc is solved using classic optimization techniques the proposed solution is compared with two main baselines the first one is a classic mpc whereas the second one is an explicit policy method that directly generates control actions without using the mpc structure the paper is well written and the approach is interesting however there are several aspects that are not fully convincing first it is not clear why the proposed representation as input to generate the final latent embedding is selected how other representations affect the learning module this is not clear second the current setup is tested and analyzed in separate scenarios and every time needs to be separately retrained this is a huge limitation of this approach and it is not clear if the architecture is able to learn different scenarios at the same time the network is tested except for minor variation in the same scenario or task it was trained for third it is not clear by using a performer type of architecture rather than the original transformer if there is any loss in performance or in the ability to learn the cost function it would have been interesting to have some study about that at least showing what the
different architectures learn in terms of costs if not possible to deploy all of them directly in the mpc for computational reasons this can also be extended considering different architectures beyond transformers finally it is not clear what model is used in the current robot and at which level of abstraction the control commands are sent and how some constraints like pedestrian detections are incorporated in the regular mpc approach an appropriate constraint should not let the robot get stuck in front of the human the proposed comparison is probably not fair in this case ### Summary:
this paper proposes a method called performermpc which uses a transformerbased learned cost function by imitation learning with visual input to be combined with mpc for realworld robot navigation tasks its effectiveness is demonstrated in multiple realworld navigation scenarios and confirmed performance improvement over standard mpc strengths the paper is well written strong technical contribution in bilevel optimization realworld experiments with four scenarios that support the authors claims weaknesses lack of some details in the proposed method experimental setups and results literature review is not sufficient no comparisons with sota post rebuttal in the rebuttal period some reviewers concerns were resolved and turned positive and all the reviewers agreed that this paper should be accepted therefore i conclude that the paper should be accepted i strongly encourage the authors to take all the reviewers suggestions for the final version of the paper
input_ids: [tokenized form of the Input and Output text above; long integer array omitted for readability]
attention_mask: [array of all ones, same length as input_ids; omitted]
labels: [identical copy of the input_ids array; omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
The work studies the problem of collecting new data to learn the outcome distribution of sensitive groups. It analyses three schemes of data collection: exploitation-only, in which decisions to give an individual a positive outcome are based solely on the current classifier; and exploration-only and active debiasing, the other two, where individuals who would not have been given a positive decision are accepted at random, with or without some constraints. Theoretically analysing the cases of unimodal and univariate distributions, the work shows that the latter two schemes estimate the outcome distributions correctly. It also proves bounded regret in making correct decisions for the active debiasing scheme. These observations are verified in experiments on two real datasets.

Strengths:
- Interesting and overlooked problem of how new data collection impacts fairness guarantees.
- The theoretical analysis poses reasonable questions on learnability in this setting and finds intuitive answers.
- Generally well-written paper.

Weaknesses:
- Some terms are not made precise, like "bias" or "debiasing".
- The most relevant work is not discussed in the main paper or appendix.
- Limitations of the assumptions (like unimodal distributions) and of the method (like not looking at the features) are not discussed.

Overall, I like the research direction of studying the interplay between data collection and fairness when starting from imperfect data; Proposition 1 is a good step towards understanding this interplay.

After the response: the response addresses my queries adequately, hence I have updated my score to 7 (accept). The paper explores interesting ideas on controlled data collection to correct biased estimates, and on the effect of imposing fairness constraints during collection. There are limitations, like the restriction to single-dimensional data, but overall the work makes a good contribution. I appreciate the thorough review of related work in the response; please include the discussion in the main paper, differentiating the problem setting and analysis from past work. Clarity of presentation needs to be improved, as noted by the other reviewers, including clarifying the problem and notation. I would suggest addressing the restriction to single-dimensional data by pointing to ways in which the insights can be transferred to the multidimensional case; it is unreasonable to expect that dimension reduction would be as lossless as shown in the Adult income experiment. Also, I could not find the exact procedure used for dimension reduction, so the result is a bit surprising if it is completely unsupervised. Limitations of the work are not discussed in the main paper; please discuss the limitations of the analysis for unimodal distributions, and of the method in ignoring the feature information during data collection. Please discuss related work and use it to motivate the uniqueness of the problem setup; some of the discussion from Appendix A can be moved to the main paper. Potential negative impacts are discussed.

docsep

The paper presents a method for data debiasing in a sequential-data setting with one feature (the analysis is extended to two in the supplement). The algorithm conducts a form of bounded random exploration in this one-dimensional setting and can be used to perform learning under fairness constraints. The authors provide a theoretical analysis of the proposed algorithm, demonstrating convergence and analyzing the regret bound. The paper includes experiments on a simulated dataset and on versions of two fairness datasets.
Overall, the paper is mostly well written and the notation is clear, with some exceptions noted below. I have three main concerns about the paper. The first is about the practical significance of the problem: this seems to be of limited real-world significance, and the authors do not identify a meaningful application where such a model and the described setting hold. Second, I have some concerns with the presentation of the work as data debiasing, as their work is really about model debiasing in an active-sampling setting. Third, I find the real-world experiments somewhat unconvincing (see below).

After response: see comments below.

Major comments:
- This work strikes me as inappropriately framed in two ways. First, it seems to be about model debiasing, not data debiasing; for example, the authors' own definition of bias is a property of the model, not of the data (cf. L43-49, L125-129). Second, this definition of bias seems wholly unrelated to any fairness concerns and is effectively a measure of model error. The work would make sense as a fairness work if their measure of bias were, for example, related to disparities in this error over subgroups; but as-is, the proposed algorithm seems to simply be a one-dimensional active-learning method that could incorporate fairness or any other constraints.
- The entirety of the main text focuses on the case of one-dimensional data, with a reference to the 2D case in the supplement. If this is the case, the proposed method seems to be of limited usefulness in most real-world classification scenarios, where many features are typically available. The authors' own somewhat contrived experiments on real-world datasets (Adult, FICO), which reduce these datasets to a single feature in order to apply the proposed method, seem to demonstrate this limited real-world usefulness. I understand that this may be an interesting theoretical case for the bandit literature, but the authors' claim to solve a meaningful fairness problem is hard to take at face value without any clear examples of such a scenario emerging in the real world.
- The real-world data experiments in particular are weak. The authors do not provide the ground-truth values in Figures 4a-4d, only one panel compares the proposed method to baselines, and the synthetic data augmentation in 4c seems ad hoc and is not described. Most concerning, however, is the authors distilling these richly featured tabular datasets into a single feature for the purposes of their experiments, which amounts to discarding information in order to shoehorn in the proposed method. This effectively ignores hundreds of published results on these datasets in the fairness literature, which would perhaps be an appropriate baseline. The results are also missing any notion of classification error or loss, if I understand them correctly.

Minor comments:
- L131: this type of assumption is common in the literature; please provide relevant citations.
- Definition 1 could use more unpacking. Some of the properties described as intuitive or otherwise are not immediately clear to me; a clearer description of the properties of the lower bound $LB_t$ would clarify the work considerably.
- The intuition in L304-312 is not clear. In particular, the sentence in L306-L309 could use more unpacking; it currently reads paradoxically, as an increase in exploration makes the model more conservative at exploration.

Typos etc.:
- I am confused why there is no hat on theta as there is on omega. Isn't it the case that there exists a true $\theta_a$, $\theta_b$ which we are estimating via $\hat{\theta}_a$, $\hat{\theta}_b$, respectively?
- L154: in my opinion, it would be clearer to retain the g subscript in the following analysis.
- "lowerbound" should be "lower bound".
- L178: "update the estimates to f" should be "update f".
- L179: $\theta_t^y$ is not an unknown parameter if we are updating it.
- L213: "overestiamted" (typo; see above).

docsep

The paper considers an algorithm to debias datasets through adaptive and bounded exploration for classification. In particular, the bias in the data is defined as inaccuracy in the estimation of a univariate distribution whose form is assumed known and which is to be fit. Their algorithm works by adaptively picking a lower bound corresponding to a function of a percentile of the estimated univariate distribution with the current parameters. Then, given a decision-threshold classification algorithm, a threshold is fitted to determine whether samples (agents) are accepted, where violation of the threshold can occur, with low probability, with respect to the calculated lower bound. The collected data is then used for (1) retraining the classifier and (2) updating the parameters of the distribution (a toy sketch of this collection rule is given below, after this review). The paper presents properties of purely-exploitation and purely-exploration algorithms, and further shows that their algorithm has favorable properties as a mix of the two; this leads to convergence of the estimated parameters. A regret analysis is done on the learnt classifiers, and the paper analyses how a fairness constraint can affect the speed of convergence. Experiments are presented over synthetic and real-world datasets.

Update: I have updated my score to a 7, similar to the other reviewers. I would also highlight that additional clarity would improve the paper greatly. One specific point on this is to make the types of fairness less ambiguous: in the discussion and in the author responses, language has been used for different types of fairness in a way that is confusing. There are two broad types of unfairness being highlighted: (1) from data versus (2) from prediction. The authors are focusing on the former; however, the language used to describe both is confusing. For instance, when describing prediction fairness, terms like "social bias", "model bias", and "unfairness" are used; but data fairness is also social, and "unfairness" is too broad a term. In my opinion, this should simply be stated as "prediction fairness" or something similar. On the other hand, data fairness is labelled "statistical bias" or "data bias"; here the statistical-bias term could refer to predictive fairness, as constraints for prediction fairness are statistical in nature. Simply put, the language here needs to be made specific, especially so in fairness, where there are many flavors.

Strengths:
- Experiments show recovery of shifted parameters.
- The theoretic results provide convergence and regret analysis.

Weaknesses:
- Writing can be improved; there are aspects of the paper which are very unclear (enumerated below).
- Figures in the main text are confusing or missing information.
- The algorithm pseudocode is not clear.

The primary limitations are stated.
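The reviews above describe the paper's bounded-exploration data-collection rule only in prose. The toy simulation below is my own illustration of that general idea (accept scores above the classifier threshold, and occasionally accept scores between an adaptive lower bound and the threshold), not the paper's algorithm: the Gaussian score model, the exploration probability, the percentile choice, and the naive running refit are all assumptions of mine.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# True (unknown) outcome-score distribution for one group, and a biased initial estimate.
true_mu, true_sigma = 0.0, 1.0
est_mu, est_sigma = 1.0, 0.5        # deliberately wrong starting parameters
threshold = 0.8                     # acceptance threshold used by the deployed classifier
eps = 0.1                           # probability of exploring below the threshold
lb_q = 0.20                         # adaptive lower bound = 20th percentile of current estimate

exploit_only, bounded_explore = [], []
for t in range(5000):
    x = rng.normal(true_mu, true_sigma)          # score of the arriving individual

    # Exploitation-only: an outcome is observed only when the classifier accepts.
    if x >= threshold:
        exploit_only.append(x)

    # Bounded exploration: additionally accept, with small probability, individuals
    # falling between an adaptive lower bound and the threshold.
    lb = norm.ppf(lb_q, loc=est_mu, scale=est_sigma)
    if x >= threshold or (x >= lb and rng.random() < eps):
        bounded_explore.append(x)
        if len(bounded_explore) >= 30:
            # Naive running refit of the distribution parameters from collected data.
            est_mu = float(np.mean(bounded_explore))
            est_sigma = float(np.std(bounded_explore) + 1e-6)

print("exploitation-only sample mean:", np.mean(exploit_only))    # censored at the threshold
print("bounded-exploration sample mean:", np.mean(bounded_explore))
print("true mean:", true_mu)
```

The printout illustrates the qualitative point made in the reviews: data collected under exploitation-only is censored at the threshold, while bounded exploration also observes part of the lower tail; the actual parameter-update and regret analysis in the paper are more involved than this naive refit.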
### Summary:
This paper has seen a lot of discussion between reviewers and authors. Reviewers are fairly positive after the discussion/rebuttal phase, and there have been significant score revisions upwards. A few concerns that were highlighted during the rebuttal/discussion phase are:
1. Multiple reviewers have pointed out that, amongst the two sources of bias (data bias and model bias), the authors focus on assembling a dataset to avoid the first type of bias. It has been pointed out that using terms like "social bias", "unfairness", and "statistical bias" is very misleading. I strongly suggest the authors revise the paper according to the reviewer comments, using more precise terminology (data bias and/or model bias). Clarity has been a concern uniformly shared amongst all reviewers.
2. The authors principally reduce the data to a single dimension using dimension-reduction techniques and use a thresholded classifier. The authors responded to this concern by saying that effective feature learning in general amounts to that, and that there are optimal data dimension-reduction techniques; further, the authors also experimentally demonstrate that the loss in accuracy due to these techniques is not large.
In summary, concerns 1 and 2 are not severe enough, as acknowledged by the reviewers raising their scores, but they are important to keep in mind while preparing the camera-ready.
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
This paper tries to investigate and understand if and how adversarial training helps models trained on a source domain transfer more easily and quickly to target domains, with extensive different configurations (such as fine-tuning strategies) in the experiments. The authors show that robust models transfer better than natural models with less training data from the target domain. They also demonstrate the intuition behind this through experiments, such as capturing shapes rather than textures, or using influence functions.

Strengths:
- The idea is interesting and has potential for impact in the community.
- The extensive experiments and investigations of how and when the robust models work better than natural models are good for demonstrating the main ideas of this paper.
- The paper is easy to understand.

Weaknesses:
- Even though it was shown by the experiments, the paper might need more theoretical understanding of why the robust models transfer better, or have better representations, than natural models.
- Even though it seems to provide some explanations, it lacks a more thorough investigation of why the specific configuration choices yield better performance than others.
- The presented dataset choices seem limited, which could limit the potential impact and applications to real-world problems. See the comments below.

Detailed comments:
- If shape is indeed more important than texture for human-like performance, is it possible to make the model work even on par with the natural models?
- Why do specific configurations work better than others, such as fine-tuning three conv blocks (and 2, 3) on CIFAR-100? And especially on CIFAR-10, fine-tuning one conv block is better than zero conv blocks; why?
- The target domains, except CIFAR-100 and CIFAR-10, are all digit datasets, so the application to real-world problems may be limited. How about using different, non-overlapping classes from those in the source domain in other image datasets as target domains, such as Caltech-256? It could make the paper stronger.
- In Table 5, the accuracy differences seem larger than the 5 written in the text.
- Typo, page 3: "nxonnegative" should be "nonnegative".

docsep

Summary of paper: this paper investigates how robust (adversarially trained) models can improve the transfer of representations, finding that they transfer better. Additionally, they investigate some reasons this could be the case, examining the biases robust models appear to induce.

Pros / strong points:
- Interesting, well-explained experiments with mostly clear, nice figures.
- Nice extra investigations giving insight into the biases conferred by adversarial training.

Cons / weak points:
- Overstatement, as though the results apply to any type of data/model, when only image data and convolutional nets are tried.
- Potential issue with the influence-function experiments.
- No analysis of the computational cost of adversarial training or other information on trade-offs.
- Background lacking / potential issues with related work.

Summary of review / recommendation: a good paper with nice, thoughtful experiments, mostly well written, although I think the biggest issue is overclaiming that results apply to all data when only image data and convnets are studied. I am also a bit worried about the thoroughness of the background research, and would like to see an analysis of the computational cost of adversarial training. If these and my question about the influence-function experiments are addressed, I would be happy to raise my score.

Detailed review:
Quality: generally well written and organized, although the link between successive sections could be made a bit stronger; the flow could be better.
- Overstatement of scope (transfer learning in general, when only image data is studied), combined with misstatements in the first parts about the origins of transfer learning, makes me skeptical of the depth of background research done and makes me suspect there may be some very relevant things which have not been surveyed/cited. See the specific questions/recommendations for details on both points below.
- Comparison of computational cost and other trade-offs in adversarial vs. natural training is not discussed.
- Potential weakness / misleading conclusions in the influence-function experiments, if I have understood correctly; see specifics below.

Clarity: the abstract and introduction would benefit from more technical, specific language to avoid ambiguity and establish the flow of the sections. Clarity within sections is pretty good; some specific suggestions below. Figures are mostly nice and clear; not so much Table 1 and 7a, though.

Originality: this is the largest potential problem and the one I am most unsure about. I am very familiar with work on statistical learning theory and generalization overall, but I am not an expert in transfer learning or adversarial methods, and I am not sure how well these works are reviewed, so I am not sure how novel this work is. The experiments are well done and well explained, though, and I think this is a good contribution even if it is less original than it is made to seem due to the lack of context.

Significance: nice insights and interesting experiments for those wanting to understand the impact of robust training on transfer. Limited practical insights without an analysis/discussion of trade-offs (e.g., overall computational time and stability). Directions proposed for future work are concrete and interesting.

Specific questions/recommendations:
- The first sentence of the abstract and the title talk about DL generically, but the second sentence is about images specifically. If you add non-image data, I'd suggest rephrasing to make the second sentence generic and maybe mentioning that this technique has been particularly successful for images. However, since all experiments are with images, I'd suggest making the title and abstract specific to that domain, e.g., "evidence from image data that adversarially-trained deep nets transfer better" or "adversarially-trained deep nets transfer to new images better".
- Overstatement of results: if you want the strong, general claims about transfer learning, I would strongly recommend doing experiments with at least MLPs in addition to convnets, and with at least one other type of data in addition to images. Otherwise the statements in the title/abstract/intro should be circumscribed to more accurately reflect the nature of the experiments.
- Mention what is adversarially vs. naturally trained, although I strongly suggest using a different term than "naturally", including in diagrams and elsewhere, as it is confusing: it makes it seem like you are comparing adversarial training to natural-gradient methods.
- First sentence: putting "data hogs" in quotes misleadingly suggests that it is a commonly used phrase; I suggest removing this (i.e., "they are known to require large ...") and adding a reference which quantifies this rather than referring to hearsay. Similarly, "stunning" is an opinion; I suggest "remarkable" or something like "excellent" which can be derived from empirical results.
- Comparison of computational cost and other trade-offs: even just a reference where this is done, with a sentence summarizing those results and other trade-offs (e.g., "it is usually at least 2x as slow and more likely to be unstable", or something like that), might be enough; but I would prefer to see full training curves and computational cost numbers in an appendix, with a line or two summarizing these in the main text.
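The cost concern raised above comes from the inner maximization loop of adversarial training. The sketch below is a generic PGD-style training step in PyTorch, written by me only to illustrate where the extra cost comes from; it is not the authors' training code, and the epsilon, step size, and number of attack steps are placeholder values that assume inputs in [0, 1].

```python
import torch
import torch.nn.functional as F

def pgd_adversarial_step(model, x, y, optimizer, eps=8 / 255, alpha=2 / 255, k=7):
    """One adversarial-training step (illustrative sketch, not the paper's code)."""
    model.eval()
    # Inner maximization: k extra forward/backward passes to craft the perturbation.
    delta = torch.zeros_like(x).uniform_(-eps, eps).requires_grad_(True)
    for _ in range(k):
        loss = F.cross_entropy(model(x + delta), y)
        grad, = torch.autograd.grad(loss, delta)
        delta = (delta + alpha * grad.sign()).clamp(-eps, eps).detach().requires_grad_(True)

    # Outer minimization: one ordinary update on the perturbed batch.
    model.train()
    optimizer.zero_grad()
    loss = F.cross_entropy(model((x + delta).detach().clamp(0.0, 1.0)), y)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Each call makes k extra forward/backward passes on top of the usual one, which is exactly the kind of slowdown the "at least 2x as slow" remark above refers to.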
- Stating that Caruana (1995) proposed transfer learning seems incorrect to me; there was a NeurIPS workshop that same year on the subject, suggesting it was already an established term at that time. I don't know the full history, but from a quick Google search it seems to have been very common in the psychology/education literature in the 70s and 80s. Here is a book that talked about it in the context of ML in the 80s (httpspdfssemanticscholarorgb547c5837bff9347dc76330a72fd7cbc517ee08cpdf), and here is Rich Sutton talking about it in '92 (httpslinkspringercomcontentpdf1010079781461536185pdf).
- Related work: the covariate-shift discussion could also mention risk extrapolation (httpsarxivorgpdf200300688pdf); how does the extrapolation done there differ from the adversarial training? It also seems like a lot of works on adversarial and contrastive training and their relationship to generalization are missing. I am not an expert in this, but starting way back, hard negative mining and other contrastive methods (e.g., word2vec) have been used and their properties discussed.
- The first subheading in Section 4 is the conclusion drawn from that paragraph ("adversarially-trained models transfer better and faster"), but subsequent headings do not have the same syntax; they are more like titles than conclusions to be drawn. I like the conclusion-as-title format; I find it very engaging and helpful for skimming, especially since there are many experiments. But most of all I would strongly suggest that all titles have the same syntax, i.e., if you can't think of conclusions-as-titles for the other bold paragraph headers (although I think you can and should), I would recommend rephrasing this one to be like the others, e.g., "comparison of adversarial and non-adversarial transfer".
- The formatting of Table 1, with captions both below and above the tables, is hard to read; put it all above or all below.
- This could have been a nice opportunity to investigate whether robust models are more or less susceptible to the types of bias people worry about in real-world image datasets (e.g., face recognition); maybe worth mentioning this in future work.
- Unless I am mistaken, the experiments with influence functions do not distinguish the effect of performance from that of training bias toward the human prior. To do so, the robust and natural methods would have to have the same accuracy, i.e., this might involve very early stopping of the robust method to match the natural model's performance. Without this, the qualitative and quantitative results could just be due to the higher accuracy of the robust method, not the particular form of prior it induces.
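The influence-function analysis debated above (Koh & Liang, 2017) is only described in prose here. The snippet below is my own minimal illustration of the underlying quantity for a tiny logistic-regression model, where the loss Hessian can be formed explicitly; it is not the paper's implementation, the data are synthetic, and for deep networks one would need Hessian-vector-product approximations instead.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny synthetic binary problem, labels in {-1, +1}.
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = np.where(X @ w_true + 0.3 * rng.normal(size=n) > 0, 1.0, -1.0)

# Fit L2-regularized logistic regression by plain gradient descent.
w, lam = np.zeros(d), 1e-2
for _ in range(2000):
    margins = y * (X @ w)
    grad = -(X * (y * sigmoid(-margins))[:, None]).mean(axis=0) + lam * w
    w -= 0.5 * grad

def grad_loss(x, yi):
    # Gradient of the per-example logistic loss at the fitted w.
    return -yi * x * sigmoid(-yi * (x @ w))

# Hessian of the regularized empirical risk at w (explicit, only feasible for tiny d).
p = sigmoid(X @ w)
H = (X.T * (p * (1 - p))) @ X / n + lam * np.eye(d)

# Influence of each training point on the loss at one test point:
# I(z, z_test) = -grad L(z_test)^T  H^{-1}  grad L(z).
x_test, y_test = rng.normal(size=d), 1.0
v = np.linalg.solve(H, grad_loss(x_test, y_test))
influences = np.array([-grad_loss(X[i], y[i]) @ v for i in range(n)])

print("most helpful training index:", int(influences.argmin()))  # upweighting lowers test loss
print("most harmful training index:", int(influences.argmax()))  # upweighting raises test loss
```

Ranking training images by this score is, as far as the reviews describe it, the kind of "most influential example" analysis the paper performs; the reviewer's point is that such rankings can differ between robust and natural models simply because their accuracies differ.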
docsep

1. Summary of contribution: this paper claims that a model pretrained adversarially can achieve better performance on transfer learning, and conducts extensive experiments on the efficacy of the adversarially trained pretrained models. The paper also conducts an empirical analysis of the trained models and shows that the adversarially pretrained models use the shape of the images, rather than the texture, to classify them. Using the influence function (Koh, 2017), the paper reveals that each influential image for the adversarially trained model is much more perceptually similar to its test example.

2. Strengths and weaknesses: the paper is well written and organized, the experiments look fair and support the claim well, and the analysis is interesting and insightful. Meanwhile, the transfer is done to domains of lower complexity, and some important comparative ideas are not extensively investigated.

3. Recommendation: while the paper's empirical results are solid, there seems to be substantial room left for comparative studies; more ablation studies should be done for other regularization methods. I believe that the paper is marginally above the acceptance threshold.

4. Reasons for recommendation: the reader will benefit more from the paper if the authors can justify their use of adversarial training as the regularization in the pretraining process. I believe that this research warrants some comparative study of dropout and weight decay, as well as random perturbations. I think the paper can be more insightful if it shows whether these other classical regularizers perform better or worse on transfer learning than the proposed approach.

5. Additional feedback: in addition to the suggestions made in 4, I also believe that a comparison should be made against a model trained without pretraining.

Post rebuttal: thank you for the response, and thank you for checking the performance comparison against the white-noise perturbation. It would be interesting to see future work involving means other than adversarial training (e.g., including other simple mechanisms like weight decay and dropout) to help reduce the overfitting effects in the pretraining phase. I would like to keep my score as is.

docsep

Contributions: the paper proposes that models that were adversarially trained transfer better to other datasets, in that they increase clean performance on the target dataset if there are only few labeled datapoints for the target task or only few training epochs are conducted. The authors test their hypothesis for ResNets pretrained on ImageNet in different threat models and transfer these models to 6 different target datasets. Generally, the results provide sufficient evidence for the paper's main hypothesis: robust models transfer better. Additional experiments provide evidence that the better transferability of robust models is partly due to relying more on shape rather than texture cues. Moreover, an additional analysis using influence functions leads to the hypothesis that robust neural networks might have learned to classify using example-based concept learning, as in human beings.

Significance: transfer learning is a topic of high relevance for practitioners, since it can reduce both data-labeling effort and training time. Improving upon the baseline of transferring models pretrained on ImageNet with a non-adversarial loss is thus a potentially significant result. However, the paper's review of the transfer-learning literature is superficial and misses some relevant related work, such as "Rethinking ImageNet Pre-training" by He et al. (ICCV 2019). Additionally, Geirhos et al. (ICLR 2019) also showed that models pretrained on Stylized ImageNet, and thus having a stronger shape bias, transfer better to object-detection tasks; this should be mentioned in Section 5. More generally, if stronger transferability is mainly due to increased shape bias, wouldn't it make sense to pretrain explicitly for a strong shape bias rather than achieving this indirectly via adversarial training, as proposed in this paper? A more thorough review of the transfer-learning literature, and relating the obtained results to it, would generally strengthen the paper.

Originality: the work is a purely empirical work studying the stated hypotheses; no novel methods are proposed. Originality can thus only come from the hypotheses. The main hypothesis, "robust models transfer better", was also proposed by Salman et al. (2020);
however, this work should be seen as concurrent, since it was released on arXiv only four months ago. The main prior work is Shafahi et al. (ICLR 2020), which also studies transferring adversarially pretrained models to other tasks; however, their focus is on the robustness gains of the transferred models rather than on the effect on clean performance. In summary, I think the main hypothesis studied in the paper is original; however, it is also clearly only a relatively small incremental step beyond what Shafahi et al. (2020) did.

Clarity: the experimental setup, training pipeline, and analysis are outlined clearly. Releasing code for fine-tuning and replicating the experiments would further strengthen reproducibility.

Quality: generally, the experiments are well conducted, covering a broad range of threat models, target datasets, training-image and epoch regimes, and fine-tuning strategies. Additionally, Sections 5 and 6 shed additional light onto why robust models might transfer better, and by this further strengthen the main message of the paper. One shortcoming is that all target tasks are image-classification tasks; whether robust models also transfer better to tasks such as object detection or semantic segmentation remains unclear.

Recommendation: in summary, I think the paper is a nice experimental study of a clearly stated hypothesis with potential practical impact. I thus lean towards acceptance, even though novelty is clearly borderline.

Final recommendation after author response: the authors have addressed several of my main concerns. It would have been helpful to study transferability to tasks beyond image recognition, but overall I think the paper has been considerably improved; I increase my score to 7. Two remarks regarding the new content: (i) I find it misleading to denote PGD as a "targeted" adversary and additive Gaussian noise as a "random" adversary (Section 7); "targeted" usually refers to an adversary that aims at achieving a specific misclassification target class, and Gaussian noise is not really an adversary but rather a distortion (image corruption); I would recommend clarifying the naming to avoid confusing readers. (ii) Is there any particular reason to use PGD-3 in Table 1b for evaluation? Would the effect also hold against stronger attacks (more iterations, etc.)?
### Summary:
the premise of the work is simple enough investigate if networks that are trained with an adversarial objective end up being more suitable for transfer learning tasks especially in the context of limited labeled data for the new domain the work uncovers the fact that shapebiased representations are learned this way and this helps for the tasks they considered there was rather robust back and forth between the authors and the reviewers the consensus is that this work has merit has good quality experiments and investigates something with high potential impact given the importance of transfer learning in general i hope that most of the back and forth findings are incorporated in the final version of this work especially the discussion and comparison with shafahi et al as well as all the nuances of the shape bias
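The reviews above all refer to the same transfer protocol: take a ResNet pretrained on ImageNet (adversarially or naturally), swap in a new classification head, and fine-tune on a small labeled target set, either end to end or with the backbone frozen. The sketch below is only a minimal illustration of that protocol under assumed names; the checkpoint path, data loader, and hyperparameters are placeholders, not the reviewed paper's actual setup or released code.

```python
# Minimal sketch of the fine-tuning protocol discussed in the reviews.
# Assumptions: a torchvision-style ResNet-50 checkpoint on disk and a
# DataLoader over the small labeled target dataset; all values illustrative.
import torch
import torch.nn as nn
import torchvision

def finetune(checkpoint_path, train_loader, num_classes, epochs=10, full_network=True):
    model = torchvision.models.resnet50()
    state = torch.load(checkpoint_path, map_location="cpu")
    model.load_state_dict(state, strict=False)               # robust or standard pretrained weights
    model.fc = nn.Linear(model.fc.in_features, num_classes)  # fresh head for the target task

    # "Fixed-feature" transfer updates only the head; "full-network" fine-tunes everything.
    params = model.parameters() if full_network else model.fc.parameters()
    optimizer = torch.optim.SGD(params, lr=0.01, momentum=0.9, weight_decay=1e-4)
    loss_fn = nn.CrossEntropyLoss()

    model.train()
    for _ in range(epochs):
        for images, labels in train_loader:
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            optimizer.step()
    return model
```

Whether the pretrained weights come from adversarial training or standard training only changes which checkpoint is loaded; the fine-tuning loop itself is identical, which is what makes the clean-accuracy comparison discussed in the reviews straightforward to reproduce.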
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper deals with image to image translation of faces solving two main typical issues 1 the style information comes from the entire region of a given exemplar collecting information from the background too without properly isolating the face area 2 the extracted style is applied to the entire region of the target image even if some parts should be kept unchanged the approach is called lomit and is very elaborate with source code which is available possible infringement of the anonymity area chair please check in few words lomit relies on a cosegmentation basis which allows it to find semantic correspondences between image regions of the exemplar and the source image the correspondences are shown as a soft mask where the user may decide to operate on some parts leaving the remaining ones unchanged in the paper this is shown for many alternatives hair eyes mouth technically the paper assembles other state of the art techniques cosegmentation networks adaptive instance normalization via highway networks but it does so nicely the major job in the paper lies in the regularization part where the authors specify each of their additions in a proper way experiments are nice since for one of the first times they provide facial images which are pleasant to see one thing i did not like was the three sets of final qualitative results where gender change results in images which are obviously diverse wrt the source one but after a while do not communicate anything new it would have been better to explore other attribute combos docsepsummary the paper tries to address an issue existing in current imagetoimage translation namely that different regions of the image should be treated differently in other words background should not be transferred while only the foreground of interest should be transferred the paper proposes to use cosegmentation to find the common areas for image translation and it reports through experiments that the proposed method works there are several major concerns to be addressed before considering publication 1 the paper says that for example in a persons facial image translation if the exemplar image has two attributes 1 a smiling expression and 2 blonde hair then both attributes have to be transferred with no other options but the model in the paper still seems incapable of transferring only one attribute perhaps an interactive transfer makes more sense while cosegmentation does not distinguish the part of interest to the user or training a semantic segmentation makes more sense as the semantic segments can specify which region to transfer 2 as cosegmentation is proposed to capture the regions of a common object existing in multiple input images why does the cosegmentation network only capture the eye and mouth parts in figures 2 and 3 why does it capture the mouth of different shape and style in the third macro column of figure 4 instead of eyes how is the cosegmentation module trained what is the objective function why not use a semantic segmentation model 3 the domaininvariant content code and the style code seem rather subjective are there any principles to design content and style codes in the experiments it seems the paper considers five styles to transfer as shown in table 1 is the model easy to extend to novel styles for image translation 4 what does the pink color mean in the very bottomleft or topright heatmap images in figure 2 there is no pink color reference in the colorbar 5 in figure 5 why are there similar dark patterns on the mouth is it some manual manipulation for interactive transfer 6 though it is always good to see the authors are willing to release code and models it appears uncomfortable that the github page noted in the abstract reveals the author information moreover in the github page even though it says an example is exampleipynb the only ipynb file contains nothing informative and this makes reviewers feel cheated minor there are several typos eg lightinig docsepthis paper proposes an unpaired imagetoimage translation method which applies the cosegmentation network and adaptive instance normalization techniques to enable manipulation of the local regions pros this paper proposes to jointly learn the local mask to make the translation focus on the foreground instead of the whole image the local maskbased highway adaptive instance normalization applies the style information to the local region correctly cons there seems to be a conflict in the introduction page 1 the authors clarify that previous methods 123 have a drawback of and then clarify that 123 have taken a userselected exemplar image as additional input as the main experiments are about facial attributes translation i strongly recommend that the authors compare their work with stargan 4 it is mentioned in the introduction page 2 that this approach has something in common with those recent approaches that have attempted to leverage an attention mask in image translation however the differences between the proposed method and these prior works are not compared or mentioned some of these works also applied the mask technique or adaptive instance normalization to the imagetoimage translation problem i wonder about the advantages of the proposed method compared to these works the experiment setting is not clear enough if i understand correctly the face images are divided into two groups based on their attributes eg smile vs no smile if so what role does the exemplar image play here since the attribute information has been modeled by the network parameters will different exemplar images lead to different translation outputs the github link for code should not provide any author information 1 multimodal unsupervised imagetoimage translation 2 diverse imagetoimage translation via disentangled representations 3 exemplar guided unsupervised imagetoimage translation 4 stargan unified generative adversarial networks for multidomain imagetoimage translation overall i think the proposed method is welldesigned but the comparison and experiment setting are not explained well my initial rating is weakly reject ### Summary:
the paper received mixed ratings the proposed idea is quite reasonable but also sounds somewhat incremental while the idea of separating foregroundbackground is reasonable it also limits the applicability of the proposed method ie the method is only demonstrated on aligned face images in addition combining adain with foreground mask is a reasonable idea but doesnt sound groundbreakingly novel the comparison against stargan looks quite anecdotal and the proposed method seems to cause only hairstyle changes but transfer with other attributes are not obvious in addition please refer to detailed reviewers comments for other concerns overall it sounds like a good engineering paper that might be better fit to computer vision venue but experimental validation seems somewhat preliminary and its unclear how much novel insight and general technical contributions that this work provides
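Both reviews and the meta-review turn on one mechanism: exemplar style statistics are injected through adaptive instance normalization, and a soft cosegmentation mask gates where that styled signal replaces the source content, so background and unselected regions pass through unchanged. The snippet below is a hedged sketch of that masked (highway-style) AdaIN combination, assuming ordinary NCHW feature maps; it is an illustrative reconstruction, not the released LOMIT code.

```python
# Illustrative masked AdaIN: style statistics are applied only where the soft
# cosegmentation mask is active, and the original content is kept elsewhere.
import torch

def adain(content, style, eps=1e-5):
    # content, style: (N, C, H, W) feature maps
    c_mean = content.mean(dim=(2, 3), keepdim=True)
    c_std = content.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style.mean(dim=(2, 3), keepdim=True)
    s_std = style.std(dim=(2, 3), keepdim=True) + eps
    return s_std * (content - c_mean) / c_std + s_mean

def masked_adain(content, style, mask):
    # mask: (N, 1, H, W) soft cosegmentation mask in [0, 1]
    styled = adain(content, style)
    return mask * styled + (1.0 - mask) * content
```

Under this sketch, editing a single attribute amounts to editing the mask (for example zeroing it outside the hair region), which is the kind of user control the first reviewer highlights and the interactive transfer the second reviewer asks about.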
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the authors describe an rnn architecture the hmrnn which models the loglikelihood of an hmm the authors provide theoretical results showing that the hmrnn objective function does indeed correspond to the loglikelihood of an hmm synthetic results are presented which compare the learned parameters between the hmrnn and an hmm trained using the baumwelch algorithm finally hmrnn training is augmented and compared to an hmm trained using the baumwelch algorithm for the task of alzheimers progression prediction firstly the idea of providing an rnn architecture which models the loglikelihood of an hmm is very interesting however the authors make several important mistakes in the broadness of their claims for instance we also make a theoretical contribution by formulating discreteobservation hmms as a special case of rnns and by proving coincidence of their likelihood functions this does not make the two models equivalent which is a major misconception stated throughout the paper for instance while the authors proved that the hmm loglikelihood is modeled by the hmrnn this does not mean hmrnns provide equivalent quantities for the betarecursion posterior inference filteringsmoothingprediction probabilities or the viterbi path in an hmm this is important to distinguish and i also think the title is incorrect by stating hmms are rnns for the evidence provided in the paper the following claim would be correct rnns can exactly model the loglikelihood of hmms the empirical evidence which is meant to support the presented theoretical results showing the equivalence of the hmrnn objective to the hmm loglikelihood also seems dubious the following is claimed in the paper and meant to be supported in section 41 we demonstrate that an hmrnn trained via gradient descent yields statistically similar solutions to the baumwelch algorithm however for the hmrnn trained via gradient descent the paper admits in practice gradient descent on the hmrnn rarely yields parameter values that are exactly zero to facilitate comparability between the hmrnn and baumwelch we postprocess all hmrnn results with one iteration of the baumwelch algorithm which forces lowprobability entries to zero even if the resulting learned parameters between the two were exact the extra iteration of baumwelch added to hmrnn results invalidates the statement that hmrnns trained via gradient descent produce similar parameters to the baumwelch algorithm furthermore wasserstein distance is used to compare the two sets of learned parameters but meansquared error is much easier to interpret and also the standard measure of similarity when comparing learned graphical model parameters eg see noorshams nima and martin j wainwright stochastic belief propagation a lowcomplexity alternative to the sumproduct algorithm ieee transactions on information theory 594 2012 19812000 in section 41 why is just a singlelayer neural network used to predict the initial state distribution do the baumwelchtrained parameters similarly use this initial state distribution if not this does not actually fairly compare to the baumwelch learned parameters since this is something easily included when training an hmm using standard baumwelch similarly the second extension to hmrnn training in section 42 does not seem entirely exclusive to rnns second at each time point the probability of being in the most impaired state is used to predict concurrent scores on the clinical dementia rating a
global assessment of dementia severity allowing another clinical metric to inform estimation this also seems like something that possible using a very simple augmented baumwelch algorithm for hmms all the overhead and complexity of hmrnn do not seem warranted also while the theoretical results do seem to be correct for discrete emission densities the notation used in the paper is really difficult to understand and the authors are inconsistent in defining variablesnotation making the description of the model difficult to follow along with for instance in definition 1 what is n the definition is very confusing without this definition is n the number of training sequences this setup is also weird because different training sequences in an hmm may have different lengths whereas t is fixed in the paper please state the notation prior to use ie that h1tj just a column vector statements involving mathbfyt and hyt are also very hard to understand please consider some other way of describing how the observations are loaded into the model finally the description of previous work is very light only one mention of combining anns and hmms is mentioned and it is largely dismissed that this work used the maximum mutual information training criterion as opposed to maximum likelihood however there is a long history of hmms combined with dnns which are called hybrid hmms there are also hybrid dynamic bayesian networks these should be discussed and the novelty of the presented work should be addressed in light of this previous work some examples bourlard herv and christian j wellekens links between markov models and multilayer perceptrons advances in neural information processing systems 1989 dahl george e et al contextdependent pretrained deep neural networks for largevocabulary speech recognition ieee transactions on audio speech and language processing 201 2011 3042 graves alex navdeep jaitly and abdelrahman mohamed hybrid speech recognition with deep bidirectional lstm 2013 ieee workshop on automatic speech recognition and understanding ieee 2013 other comments it seems warranted that some kind of complexity be given for the hmrnn since one of the major selling points of hmms is that the forwardrecursion can be done in ot k2 time and ot k memory in section 41 please list implementation details regarding how hmrnn and the baumwelch algorithms were run this is generally important but is even more so since timing numbers are reported in section 42 also since it not discussed at any length it is unclear whether the authors ran the betarecursion along with the alpharecursion the only of the two discussed which is necessary to perform the baumwelch algorithm since hmms may have multiple local optima this statement is improper training hmms using embaumwelch may have multiple local optima note that other hmm parameter estimation algorithm exist such as spectral learning algorithms which are not subject to multiple local optima it looks like the range of t values in the alzheimers application is 3 4 and 5 is that right if so that is incredibly small to be testing an hmm in general while this application is undoubtedly important given that the premise is to show that hmrnns can do things better than hmms a much better benchmark would be for a sufficiently long timeseries such as speech data the authors would probably gain more traction considering speech data as 1 hmms have been most leveraged for this application domain 2 kaldi exists and you can easily modify its source 3 hmms still play an important part in 
the training of dnnspeech architectures eg serving as one of the building blocks in training a cddnnhmm there are many many other applications of hmms to much longer time sequences with easily modifiable opensource implementations or the pomegranate graphical models package is also available that the authors can turn to to more convincingly demonstrate the superiority of their architecture to standard hmms the parantheses around superscripts seems completely unnecessary in section 33 n is not still not defined which is confusing docsepthe paper proposes a hidden markov recurrent neural network hmrnn that mimics the behavior of traditional hidden markov models hmm and shows that the proposed network can be trained to obtain a similar solution as hmm some questions 1 the topic is not significant the paper proposes an rnn to mimic the behavior of traditional hmm although the paper proves that the model can be optimized to obtain similar parameters as hmm it doesnt include any comparison or experiment for the usefulness of the model 2 the paper lacks the necessary discussions in particular the paper claims that the proposed model allows for substantial modularity with other predictive networks even though the paper has added an additional network on the alzheimer dataset the paper does not show any ablation experiment on the contribution of each component and eventually the effectiveness of the proposed work 3 the paper lacks baseline models for the disease progression modeling there are several disease progression models based on hmm eg 1 and 2 the paper doesnt compare to such baseline models to demonstrate the effectiveness of the proposed work 1 yuying liu shuang li fuxin li le song james m rehg efficientlearningofcontinuous time hidden markov models for disease progression in advances in neural information processing systems pages 36003608 2015 2 ahmed m alaa scott hu and mihaela van der schaar learning from clinical judgments semimarkovmodulated marked hawkes processes for risk prognosis international conference on machine learning 2017 4 the experiment on alzheimers disease dataset is not well described feature dimension covariances etc and no code or pseudo algorithm is provided making the experimental results and findings hard to be reproduced 5 the methodology is not well explained in the traditional hmm the states in the next timestamp are computed based solely on the state on the current timestamp and are independent of the observation however based on equation4 6 the proposed rnn is computing the states based on the current observationdocsepthis paper introduces a novel architecture of recurrent neural network that mimics the working of a standard hmm in particular the proposed hmrnn learns modelparameters which are statistically similar solutions to those of a standard hmm within a general neural network learning framework while it is shown the proposed network similarly to the hmm there are many issues that should be considered critically in eq 5 yt should be ytn to indicate nth columnsample and 1nxkc be 11xkc the notation of psiij is undefined in the simulation study it is recommended to initialize parameters with random values instead of exploiting the background knowledge for the simulation environment it is unclear for the reason of postprocessing to compare with the standard hmm the network learning hyperparameter values need describing for reproducibility page 7 in the middle what is the meaning of ht12 due to inconsistency in notations it is hard to read the manuscript the 
experimental settings should be denoted in detail note that the authors argued and stated in section 3 that hmrnn mimics the operations of the standard hmm and produces statistically similar solutions to the bw algorithm if so it does not make sense to this reviewer why the hmrnns performance was superior to the hmm the problem setting over alzheimers disease case study is not clear to this reviewers understanding the input to the models was a sequence of quantized mmse values and the final output is the category of either cdr05 or cdr05 in fig 3 why it is about quantization categories of mmse ie 0 1 and 2 there should be more rigorous experiments and their results to better support the validity of the proposed methoddocsep summary authors demonstrated that one can encode the data likelihood function of an hmm using a specialized rnn architecture unlike previous work where neurons from different layers were multiplied together the new encoding strictly followed the classical architecture restrictions of a neural network ie each layer was a weighted sum of the previous layer empirically author showed that the parameter learned by applying gradient descent on the likelihood function is similar to the one that is obtained using the em algorithm in addition authors demonstrated that such formulation enables an application in studying alzheimers disease symptom progression strength it is an interesting new connection between rnn and hmm that was found by authors i found the alzheimers application is very inspiring compared to the baseline method that uses a single hmm to model the progression of the alzheimer for every patient author proposed to accommodate each individual patients attributes when predicting the progression this leads to a more accurate prediction of the final alzheimers prediction for each patient weakness the motivation for the connection between rnn and hmm is not very clear and it hinders the significance of the work as author has pointed out there are multiple work that tries to formulate the likelihood function of a graphical model ie a hmm as a computation graph besides the ones that are cited by the author chapter 12 of 1 also shows general method to generate a computation graph from a graphical model and the computation graph evaluates the probability query on the graphical model these different representations differ only in the syntactical representation of the likelihood function although authors managed to formulate the function using the structure of a classical rnn the underlying function is semantically equivalent to the previous methods hence the gradients wrt the hmm parameters are going to the same in all different representations hence performing gradient descent on a hmrnn does not seem to be different from performing gradient descent on other syntactical representations of the likelihood function 1 modeling and reasoning with bayesian networks adnan darwiche questions it was not clear to me how the supervisions from each patients attributes were incorporated to model the alzheimers progression as there was no ground truth on the hidden states i was not sure how the initial probabilities were trained and how the cdr helped to supervise the training of the hidden states are these supervisions incorporated through the loss function ### Summary:
there is consensus that the submission is not yet ready for publication the reviews contain multiple comments and suggestions and i hope they can be useful for the authors
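The reviews above turn on two technical points: that an RNN-style recurrence can reproduce the log-likelihood of a discrete-emission HMM exactly (the alpha, or forward, recursion), and that this recursion costs O(T K^2) time. The following is a minimal sketch of that recursion written as a simple recurrent update; the NumPy formulation and variable names are illustrative assumptions, not the paper's actual HMRNN layer definitions.

```python
import numpy as np

def hmm_log_likelihood(obs, pi, A, B):
    """Scaled forward (alpha) recursion for a discrete-emission HMM.

    obs: (T,) integer array of observed symbols
    pi:  (K,) initial state distribution
    A:   (K, K) transitions, A[i, j] = P(z_t = j | z_{t-1} = i)
    B:   (K, M) emissions,   B[i, m] = P(x_t = m | z_t = i)
    Cost is O(T * K^2) time and O(K) memory, matching the complexity remark in the review.
    """
    alpha = pi * B[:, obs[0]]                 # first "recurrent state"
    c = alpha.sum()
    alpha, log_lik = alpha / c, np.log(c)
    for o in obs[1:]:                         # one update per time step, as in an RNN
        alpha = (alpha @ A) * B[:, o]         # linear recurrence plus elementwise emission term
        c = alpha.sum()                       # rescaling avoids numerical underflow
        alpha, log_lik = alpha / c, log_lik + np.log(c)
    return log_lik

# Toy usage with two hidden states and three symbols (values are illustrative).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
obs = np.array([0, 1, 2, 2, 1])
print(hmm_log_likelihood(obs, pi, A, B))
```

Note that this computes only the likelihood; as the first reviewer stresses, it does not by itself provide the beta-recursion, posterior or filtering probabilities, or the Viterbi path.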
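Several reviewers also contrast gradient-descent training of the likelihood with the Baum-Welch (EM) updates, and the last reviewer notes that the gradient is the same for any syntactic representation of the likelihood. A hedged sketch of the gradient-descent route is below, with rows of the transition and emission matrices kept on the simplex through a softmax; this parameterisation is an assumption for illustration, not necessarily the one used in the paper.

```python
import torch

def forward_log_lik(obs, log_pi, log_A, log_B):
    # Log-space forward recursion; obs is a 1-D LongTensor of observed symbols.
    log_alpha = log_pi + log_B[:, obs[0]]
    for o in obs[1:]:
        log_alpha = torch.logsumexp(log_alpha.unsqueeze(1) + log_A, dim=0) + log_B[:, o]
    return torch.logsumexp(log_alpha, dim=0)

K, M = 3, 4                                   # hidden states and symbols (toy sizes)
pi_u = torch.zeros(K, requires_grad=True)     # unconstrained parameters
A_u  = torch.zeros(K, K, requires_grad=True)
B_u  = torch.zeros(K, M, requires_grad=True)
obs  = torch.randint(0, M, (50,))             # stand-in training sequence

opt = torch.optim.Adam([pi_u, A_u, B_u], lr=0.05)
for step in range(200):
    opt.zero_grad()
    # log-softmax keeps each row a proper probability distribution
    loss = -forward_log_lik(obs,
                            torch.log_softmax(pi_u, dim=0),
                            torch.log_softmax(A_u, dim=1),
                            torch.log_softmax(B_u, dim=1))
    loss.backward()
    opt.step()

print(torch.softmax(A_u, dim=1))              # learned transition matrix
```

Unlike Baum-Welch, such training rarely drives entries exactly to zero, which is the comparability issue the first reviewer raises about the extra post-processing iteration.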
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

The authors consider the practice of weight quantization in deep convolutional neural networks and deep convolutional graph neural networks. Leveraging the analogy between convolutional architectures and time-dependent PDEs, the authors argue for certain conditions that should improve the stability of network outputs with respect to weight quantization. In actuality, they propose two practices: a specific form of weight-sharing to create a symmetry in the action of convolutional layers, and the use of CFL-like conditions to guarantee the numerical stability of these symmetric layers. This is an ingenious use of the analogy between neural architectures and DEs to understand an increasingly important problem in deep learning, i.e., the need for enormous amounts of computer memory. I recommend the submission be accepted to the workshop.

Discussion:

Line 3: the authors state that fixed-point arithmetic is not natural to neural computations. I am curious to better understand what is meant here. Are there other applications in which fixed-point arithmetic can be incorporated more naturally? Similarly, are there really any applications in which reducing the floating-point accuracy of variables can be done without careful precautions? I am no expert in this matter, but it seems to me that deep networks are remarkably robust to weight quantization compared to, e.g., how I would expect a finite element solver to respond to a reduction from 32-bit variables to 4-bit variables. This has little impact on the rest of the submission, but it piqued my curiosity.

Lines 21-30: this discussion is a little clumsy given how central this analogy is to the rest of the work. I think the authors' main point is that the forward evaluation of a convolutional network with residual connections through its layers is analogous to the forward evolution of a PDE through time; the other points made in the paragraph distract from this message. Similarly, a discussion of the CFL conditions would probably make this more accessible to those members of the DL/DE/SciML community who may be less familiar with classical numerical analysis. Lines 35-47 are good; lines 48-53 appear to repeat something very similar.

Fig. 1: these might benefit from being plotted with a logarithmic MSE axis, since the ratio of MSE between the stable and unstable variants is of primary interest. Also, can the authors explain what happens at the jumps around layers 9 and 16, and why the blue and red curves respond very differently in the intervening layers?

The authors may be interested in the recently released Principles of Deep Learning Theory (https://arxiv.org/abs/2106.10165); that work provides an interesting perspective on the stability of very deep neural networks which may be complementary to the results discussed here.

The submission might benefit from additional discussion of related works, especially regarding the type of symmetry they impose in equations 2 and 4. To what extent has such symmetry been studied before, e.g., in refs 12-21? In particular, arguably the most compelling result of the submission is that these symmetric layers perform nearly as well as the more general non-symmetric layers despite consuming half the memory, regardless of weight quantization. The authors conjecture that this advantage is due to smoother optimization (line 117), although I don't find this explanation convincing; do the authors have any evidence of this? I was inclined to interpret this behaviour as being due to an inductive bias, similar to the translation-invariance or locality properties of convolutions. Can the authors comment on what inductive biases the symmetric layers may create for the resulting neural architectures?

The authors' experimental protocol could use some clarification. What exactly do the columns 4w8a, etc., in Tables 1-3 mean? Was quantization effected before, during, or after training? Admittedly, it seems that many of my questions are resolved in the fairly long supplemental materials.

Table 3: the authors claim that the symmetric networks preserve their accuracy better under quantization than the non-symmetric networks. However, it seems to me that the data do not clearly support this argument. For the Citeseer case, the symmetric vs. non-symmetric accuracies differ from left to right by 17, 17, and 11; if anything, the symmetric network has performed slightly worse under quantization than the non-symmetric one, unless I am missing something. Perhaps the difference in accuracy values is not the best way of comparing the response of these networks to quantization, but if that is the case, then the authors should provide a more explicit form of evidence to support the claim that the symmetric/stable networks respond better to quantization in some suitably defined sense.

Line 115: I believe this is incorrect; the symmetric architectures will generally be less expressive than the non-symmetric ones. This is similar to the fact that convolutional networks are less expressive than fully general MLPs. As alluded to above, this loss in expressivity is acceptable, and indeed beneficial for certain applications, because it corresponds to an inductive bias that restricts the possible functions the network can represent.

Summary: The authors study the stability properties of CNN and GCN models with quantization, the process of reducing computational requirements by reducing the precision of the parameters in a neural network. In particular, error propagation in the network is analyzed through the lens of partial differential equations, and the stability properties are considered analogously to the Courant-Friedrichs-Lewy conditions for PDEs. From this analysis, the authors design stable variants of quantized CNNs and symmetric variants of quantized GNNs. Experiments show that these new architectures can yield improvements in accuracy; moreover, stable and symmetric variants exhibit a per-layer bounded divergence in error propagation compared to their non-stable/asymmetric counterparts.

Significance of the work: Designing quantized neural networks with stability properties through the lens of partial differential equations is an interesting and neat idea. Moreover, this work can bring relevant practical contributions by introducing Pareto-optimal neural networks in terms of computational requirements and accuracy, for applications with resource or time constraints.

Other: The paper is well written and structured. Some miscellaneous notes: line 70, the Jacobian mathbf{J} is not explicitly defined in the main text; Table 3, caption typo.

Using PDE/ODE stability properties to design CNNs and GNNs that perform better under feature quantization is a really neat idea. This is a nice contribution to the field, with high technical merit and practical utility. Several improvements could, however, be made to the sections dealing with graph neural networks. "GCN" generally refers to the model of "Semi-supervised classification with graph convolutional networks" (Kipf and Welling, ICLR 2017), with "GNN" being the more common general term for graph convolutional neural networks. I assume that this work applies to the more general class of message-passing GNNs and not just to the GCN model of Kipf and Welling; otherwise the scope of this work is somewhat limited. Missing citations related to diffusion PDEs and GNNs: "Continuous graph neural networks" (ICML 2020) and "GRAND: graph neural diffusion" (ICML 2021).

### Summary:

This paper draws an analogy between quantisation error in quantised neural networks and truncation error in the numerical solution of differential equations. All reviewers agree that this is an excellent paper. I agree the analogy is a good one, with the potential for much further exploration. In addition to the very substantial review comments, I would make a few of my own.

For the quantisation error to decay, the authors mandate that the spectral radius of the Jacobian be strictly less than 1. This means that the corresponding dynamical system converges towards a fixed point: in the infinite-layer limit, all possible inputs would produce the same output, and in any finite approximation there is merely some bias towards this behaviour. This seems very unlikely to be desirable behaviour and is, in my opinion, the greatest weakness of the paper.

The authors define stability simply via a Lipschitz condition (equation 1). Theoretically speaking, at least, this is always true provided the activation function is Lipschitz; it is not clear that this motivation really adds anything.

Nitpick: when separating names (line 29) it is correct to use an en dash, obtainable as -- in LaTeX, rather than a hyphen.
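The reviews above revolve around two ingredients: a weight-sharing symmetry inside each convolutional layer (which also halves the stored weights) and a CFL-like bound on the effective step size so that forward propagation, and hence quantization error, does not amplify. A minimal sketch of one common construction of such a layer from the PDE-inspired architecture literature is given below; the exact parameterisation used in the paper under review may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SymmetricResidualBlock(nn.Module):
    """Residual block with a shared-kernel symmetry of the kind discussed in the reviews.

    The same convolution K is applied forward and then, negated, through its transpose,
    so the update term has a symmetric negative semi-definite Jacobian. The step size h
    plays the role that a CFL-style condition would bound. This is a generic construction,
    assumed here for illustration, not necessarily the paper's exact layer.
    """

    def __init__(self, channels, h=0.1):
        super().__init__()
        self.K = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.h = h

    def forward(self, z):
        u = torch.relu(self.K(z))   # K z followed by a pointwise nonlinearity
        # conv_transpose2d with the same weight realises K^T; the minus sign makes the
        # update dissipative (error-damping) rather than amplifying
        return z - self.h * F.conv_transpose2d(u, self.K.weight, padding=1)

# Usage: a drop-in residual block that stores a single kernel per layer.
block = SymmetricResidualBlock(channels=16, h=0.1)
x = torch.randn(2, 16, 32, 32)
print(block(x).shape)
```

Because K is reused for both the forward and the transposed application, the block stores half the convolution weights of a comparable non-symmetric residual block, and for a small enough step size h the update is non-expansive, which is the discretisation-stability condition the reviews refer to.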
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 4477, 1908, 253, 3946, 273, 2801, 36643, 275, 3676, 27311, 267, 11454, 6928, 285, 3676, 27311, 267, 4216, 11454, 6928, 19732, 2977, 253, 24760, 875, 27311, 267, 35615, 285, 37282, 2662, 268, 3229, 253, 4477, 9059, 323, 2176, 2515, 326, 943, 3157, 253, 7882, 273, 2990, 18012, 342, 1675, 281, 2801, 36643, 275, 4588, 414, 597, 12661, 767, 8333, 247, 2173, 830, 273, 2801, 35870, 281, 2794, 247, 10377, 275, 253, 2250, 273, 27311, 267, 8090, 285, 253, 897, 273, 21194, 620, 2804, 2515, 281, 12215, 253, 10704, 7882, 273, 841, 13123, 8090, 436, 310, 271, 35604, 784, 897, 273, 253, 24760, 875, 11454, 35615, 285, 711, 281, 2096, 271, 9592, 1774, 1895, 275, 3676, 4715, 26332, 253, 878, 323, 14779, 8322, 273, 4382, 3541, 891, 5583, 253, 19529, 320, 7607, 281, 253, 22586, 50276, 49794, 50276, 1282, 495, 253, 4477, 1375, 326, 4229, 3659, 27844, 310, 417, 3626, 281, 11454, 30745, 891, 717, 14338, 281, 1805, 2096, 752, 310, 5486, 1060, 403, 627, 643, 4893, 275, 534, 4229, 3659, 27844, 476, 320, 11217, 625, 10748, 12014, 403, 627, 1663, 667, 4893, 275, 534, 8493, 253, 14974, 1127, 7200, 273, 4903, 476, 320, 2218, 1293, 10182, 39933, 891, 717, 642, 6485, 275, 436, 2647, 533, 352, 3133, 281, 479, 326, 3676, 6928, 403, 13406, 10237, 281, 2801, 36643, 2429, 281, 24088, 849, 891, 651, 1902, 247, 6486, 3284, 47037, 281, 3794, 281, 247, 5141, 432, 4567, 2372, 4903, 281, 577, 2372, 4903, 436, 556, 1652, 3486, 327, 253, 1551, 273, 253, 19529, 533, 352, 268, 3008, 264, 619, 14338, 414, 50276, 8737, 374, 11246, 436, 5955, 310, 247, 1652, 43331, 90, 1677, 849, 4275, 436, 24760, 310, 281, 253, 1551, 273, 253, 789, 891, 1158, 253, 4477, 2022, 1127, 310, 326, 253, 3579, 7103, 273, 247, 27311, 267, 2990, 342, 12541, 10291, 949, 697, 8090, 310, 19890, 281, 253, 3579, 5606, 273, 247, 268, 615, 949, 673, 253, 643, 2792, 1160, 275, 253, 12494, 36815, 432, 436, 3935, 12014, 247, 5955, 273, 253, 260, 1258, 2515, 651, 3164, 1056, 436, 625, 12482, 281, 1110, 2758, 273, 253, 277, 392, 20039, 303, 77, 3114, 665, 778, 320, 1679, 7615, 342, 8946, 10704, 1783, 50276, 8737, 4791, 2504, 403, 1175, 3104, 577, 35866, 3176, 281, 10280, 1633, 1077, 2074, 50276, 926, 337, 841, 1537, 5649, 432, 1146, 38542, 342, 247, 32643, 278, 339, 7844, 1580, 253, 4313, 273, 278, 339, 875, 253, 6474, 285, 17631, 11640, 310, 273, 3625, 1600, 671, 476, 253, 4477, 5513, 752, 6569, 387, 253, 27287, 1475, 8090, 898, 285, 1668, 285, 2139, 253, 4797, 285, 2502, 9191, 3794, 1077, 13359, 275, 253, 37686, 8090, 50276, 783, 4477, 778, 320, 6110, 275, 253, 4102, 4439, 9241, 273, 3676, 4715, 3762, 5987, 39962, 2061, 5375, 16899, 3832, 520, 2082, 326, 789, 3400, 271, 4722, 8668, 273, 253, 7882, 273, 1077, 3676, 11454, 6928, 534, 778, 320, 19767, 281, 253, 1543, 5469, 1060, 50276, 783, 19529, 1537, 5649, 432, 3081, 5955, 273, 2905, 2987, 3340, 5001, 253, 1511, 273, 10377, 597, 16209, 275, 7424, 374, 285, 577, 281, 752, 6070, 556, 824, 10377, 644, 5421, 1078, 24088, 275, 1275, 84, 1249, 1797, 275, 1798, 25711, 253, 954, 18511, 906, 273, 253, 19529, 310, 326, 841, 13123, 8090, 1347, 4829, 347, 973, 347, 253, 625, 2087, 14122, 25562, 8090, 5747, 21337, 2716, 253, 3541, 10159, 273, 2801, 36643, 253, 4477, 24366, 326, 436, 5750, 310, 1955, 281, 39797, 977, 13757, 1386, 12387, 3738, 891, 13414, 1089, 436, 8813, 21414, 513, 253, 4477, 452, 667, 1941, 273, 436, 891, 369, 21802, 281, 4665, 436, 8770, 347, 1146, 1955, 281, 271, 42115, 
8492, 2074, 281, 253, 10234, 7821, 14417, 390, 33643, 3607, 273, 2410, 17009, 476, 253, 4477, 4385, 327, 752, 42115, 31306, 253, 13123, 8090, 778, 3562, 323, 253, 4795, 11454, 35615, 50276, 783, 4477, 5661, 7241, 812, 897, 690, 8254, 338, 2492, 752, 4555, 513, 253, 9930, 577, 88, 25, 66, 3966, 275, 7180, 2145, 1599, 369, 36643, 37099, 1078, 1309, 390, 846, 3733, 47421, 352, 3133, 326, 1142, 273, 619, 3533, 403, 11512, 275, 253, 9648, 1048, 25702, 4753, 50276, 2420, 495, 253, 4477, 1750, 326, 253, 13123, 6928, 14003, 616, 7200, 1805, 762, 36643, 685, 253, 14122, 25562, 6928, 2299, 352, 3133, 281, 479, 326, 253, 941, 513, 417, 4518, 1329, 436, 4154, 323, 253, 4851, 3248, 254, 1083, 253, 13123, 4632, 14122, 25562, 3933, 19103, 9184, 432, 1669, 281, 987, 407, 1722, 1722, 285, 1903, 604, 2712, 253, 13123, 2990, 556, 2684, 5777, 7197, 762, 36643, 326, 253, 14122, 25562, 581, 5734, 891, 717, 5816, 1633, 4931, 253, 3064, 275, 7200, 2193, 310, 417, 253, 1682, 1039, 273, 10941, 253, 2380, 273, 841, 6928, 281, 36643, 533, 604, 326, 310, 253, 1083, 840, 253, 4477, 943, 2085, 247, 625, 6843, 830, 273, 1941, 281, 1329, 253, 1750, 326, 253, 13123, 11351, 6928, 3794, 1805, 281, 36643, 275, 690, 43364, 2931, 3282, 50276, 1282, 11343, 891, 2868, 436, 310, 13583, 253, 13123, 35615, 588, 3839, 320, 1679, 43541, 685, 253, 14122, 25562, 4394, 436, 310, 2074, 281, 253, 958, 326, 27311, 267, 6928, 403, 1679, 43541, 685, 4751, 2087, 13361, 793, 347, 512, 21015, 281, 1840, 436, 2957, 275, 3890, 2351, 310, 12207, 285, 6296, 12912, 323, 2176, 4893, 984, 352, 10140, 281, 271, 42115, 8492, 326, 45798, 253, 1896, 3470, 253, 2990, 476, 1957, 5474, 33032, 6010, 50276, 783, 4477, 1263, 253, 7882, 3607, 273, 260, 9866, 285, 305, 14340, 3210, 342, 36643, 253, 1232, 273, 8493, 15180, 6095, 407, 8493, 253, 12320, 273, 253, 3602, 275, 247, 11454, 2990, 275, 1798, 2228, 18634, 275, 253, 2990, 310, 5867, 949, 253, 9655, 273, 7898, 8967, 7424, 285, 253, 7882, 3607, 403, 2783, 7370, 4087, 281, 253, 1960, 386, 41717, 5969, 84, 282, 22383, 2515, 323, 268, 3229, 432, 436, 1783, 253, 4477, 2216, 6474, 11640, 273, 2677, 1025, 260, 79, 2224, 285, 13123, 11640, 273, 2677, 1025, 18976, 2224, 4679, 921, 326, 841, 747, 35615, 476, 4917, 11701, 275, 7200, 25761, 6474, 285, 13123, 11640, 10738, 247, 591, 12026, 11542, 23279, 275, 2228, 18634, 2429, 281, 616, 1327, 11351, 284, 25562, 21421, 50275, 9188, 40348, 273, 253, 789, 50276, 19417, 272, 2677, 1025, 11454, 6928, 342, 7882, 3607, 949, 253, 9655, 273, 7898, 8967, 7424, 310, 271, 4722, 285, 18176, 2934, 25761, 436, 789, 476, 3324, 4623, 8542, 9021, 407, 16984, 22865, 936, 8654, 11454, 6928, 275, 2426, 273, 15180, 6095, 285, 7200, 323, 4893, 342, 7741, 390, 673, 10806, 50275, 977, 50276, 783, 2929, 310, 973, 3542, 285, 18872, 690, 27722, 43295, 7211, 50275, 1282, 5571, 253, 480, 317, 706, 757, 14168, 3342, 75, 310, 417, 11120, 2931, 275, 253, 2022, 2505, 50276, 2420, 495, 11743, 1745, 80, 285, 285, 7152, 33032, 5302, 268, 615, 50276, 853, 7882, 3607, 281, 2216, 260, 9866, 50276, 3757, 79, 326, 759, 630, 1805, 762, 4735, 36643, 310, 247, 1663, 18176, 7445, 436, 310, 247, 5322, 7680, 281, 253, 1673, 342, 1029, 7681, 15785, 285, 8542, 11839, 50275, 43249, 11701, 812, 2299, 320, 1160, 281, 253, 7118, 10620, 342, 4216, 11454, 6928, 50276, 72, 14340, 3839, 10770, 281, 253, 1566, 273, 49863, 29974, 13337, 9162, 342, 4216, 27311, 267, 6928, 465, 532, 71, 285, 973, 272, 17857, 32888, 1166, 342, 305, 9866, 1146, 253, 625, 1846, 2087, 1307, 323, 4216, 27311, 267, 11454, 6928, 891, 5467, 326, 436, 789, 
10384, 281, 253, 625, 2087, 966, 273, 3935, 8136, 18976, 2224, 285, 417, 816, 281, 253, 305, 14340, 1566, 273, 465, 532, 71, 285, 973, 272, 5010, 253, 7990, 273, 436, 789, 310, 8489, 3710, 50276, 33722, 30404, 2905, 281, 12393, 268, 3229, 285, 18976, 2224, 50276, 38927, 4216, 11454, 6928, 17857, 1686, 938, 50276, 26852, 4216, 11454, 12393, 17857, 1686, 1797, 2490, 187, 4118, 18435, 27, 2520, 2929, 21354, 271, 24760, 875, 2677, 5837, 2228, 275, 2677, 1701, 11454, 6928, 285, 47024, 2228, 275, 253, 10704, 2900, 273, 8967, 7424, 50276, 455, 30628, 5194, 326, 436, 310, 271, 7126, 2929, 891, 5194, 253, 24760, 310, 247, 1175, 581, 342, 253, 2442, 323, 1199, 2007, 17947, 50276, 249, 1635, 281, 253, 1077, 6832, 2278, 5701, 891, 651, 1056, 247, 1643, 273, 619, 1211, 50276, 1542, 253, 2677, 5837, 2228, 281, 10027, 253, 4477, 21787, 326, 253, 9879, 9941, 273, 253, 480, 317, 706, 757, 320, 13714, 1679, 685, 337, 436, 2097, 326, 253, 3969, 18525, 985, 26414, 4404, 247, 4229, 1127, 275, 253, 2192, 4478, 293, 4071, 2701, 512, 1896, 14800, 651, 4711, 253, 1072, 3453, 285, 275, 667, 6486, 11193, 627, 310, 7960, 690, 8492, 4404, 436, 8770, 436, 3133, 1077, 11543, 281, 320, 11408, 8770, 285, 310, 275, 619, 4743, 253, 6459, 14855, 273, 253, 2929, 50276, 783, 4477, 4853, 7882, 3365, 3066, 247, 11233, 37913, 1617, 5150, 337, 28055, 8288, 387, 1878, 436, 310, 1900, 2032, 2530, 253, 5743, 1159, 310, 11233, 37913, 352, 310, 417, 2590, 326, 436, 16038, 1663, 11323, 2712, 50276, 32202, 29397, 672, 23694, 4454, 1386, 3285, 352, 310, 3451, 281, 897, 271, 546, 20134, 50276, 706, 14721, 494, 347, 50276, 249, 44127, 2581, 685, 247, 3500, 257, 50275 ]
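The three numeric columns above (input_ids, attention_mask, labels) are token-ID encodings of the review and summary text in the Input and Output columns. Below is a minimal sketch of how such columns are commonly produced; the tokenizer name ("gpt2"), the choice to concatenate Input and Output, and the mirroring of input_ids into labels are illustrative assumptions, not details confirmed by this dump.

```python
# Illustrative sketch only: building input_ids / attention_mask / labels columns
# like the ones above. The tokenizer and the concatenation convention are assumptions.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder tokenizer choice

def encode_row(input_text: str, output_text: str, max_length: int = 2048) -> dict:
    enc = tokenizer(input_text + output_text, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],            # token IDs for the concatenated text
        "attention_mask": enc["attention_mask"],  # all ones when no padding is applied
        "labels": list(enc["input_ids"]),         # labels mirror input_ids, as in these rows
    }
```

With this convention the attention_mask is a run of ones and labels duplicates input_ids, which matches the structure of the rows shown here.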
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper proposes an exploration of the effect of normalization and initialization in residual networks in particular the authors propose a novel way to initialize residual networks which is motivated by the need to avoid explodingvanishing gradients the paper proposes some theoretical analysis of the benefits of the proposed initialization i find the paper well written and the idea well executed overall the proposed analysis is clear and motivates well the proposed initialization overall i think this adds something to the literature on residual networks helping the reader to get a better understanding of the effect of normalization and initialization i have to admit i am not an expert on residual networks so it is possible that i have overlooked at previous contributions from the literature that illustrate some of these concepts already having said that the proposal seems novel enough to me overall i think that the experiments have a satisfactory degree of depth the only question mark is on the performance of the proposed method which is comparable to batch normalization if i understand correctly this is something remarkable given that it is achieved without the common practice of introducing normalizations however i have not found a convincing argument against the use of batch normalization in favor of zeroinit i believe this is something to elaborate on in the revised version of this paper as it could increase the impact of this work and attract a wider readership docsepsummary a method is presented for initialization and normalization of deep residual networks the method is based on interesting observations regarding forward and backward explosion in such networks with the standard xavier or he 2015 initializations experiments with the new method show that it is able to learn with very deep networks and that its performance is on a par with the best results obtained by other networks with more explicit normalization advantages the paper includes interesting observations resulting in two theorems which show the sensitivity of traditional initializations in residual networks the method presented seems to work comparable to other state of the art initialization normalization methods providing overall strong empirical results disadvantages the authors claim to suggest a method without normalization but the claim is misleading the network has additive and multiplicative normalization nodes and their function and placement is at least as mysterious as the role of normalization in methods like batch and layer normalization o this significantly limits the novelty of the method it is not an intialization method but a combination of initialization and normalization which differ from previous ones in some details the method includes 3 components of which only one is justified in a principled manner the other components are not justified neither by an argument nor by experiments without such experiments it is not clear what actually works in this method and what is not important the argument for the justified component is not entirely clear to me the main gist is fine but important details are not explained so i could not get the entire argument stepbystep this may be a clarity problem or maybe indicate deeper problem of arbitrary decisions made without justification i am not entirely sure such lack of clear argumentation occurs in several places experiments isolating the 
contribution of the method with respect to traditional initializations are missing for example experiments on cifar10 and svhn showing the result of traditional initializations with all the bells and whistles cutout mixup as the zeroinit gets more detailed comments page 3 while i could follow the general argument before eq 2 leading to the conclusion that the initial variance in a resnet explodes exponentially i could not understand eq 2 what is its justification and how is it related to the discussion before it i think it requires some argumentation page 4 i did not understand example 2 for a ph set i think an argument reminder of the details of resnet or a figure are required i could not follow the details of the argument leading to the zeroinit method o how is the second design principle varflxl o 1l justified as far as i can see having varflxl 1l will lead to output variance of 11ll e which is indeed o1 is this the argument is yes why wasnt it stated also why not smaller than o1l o following this design principle several unclear sentences are stated we strive to make varflxl 1l yet we set the last convolutional layer in the branch to 0 weights does not it set varflxl 0 in contradiction to the 1l requirement assuming the error signal passing to the branch is o1 what does the term error signal refer to how is it defined do you refer to the branchs input i understand why the input to the mth layer in the branch is olambdam1 if the branch input is o1 but why is it claimed that the overall scaling of the residual branch after update is olambda2m2 what is the overall scaling after update definition and why is it the square of forward scaling the zero init procedure step 3 is not justified by any argument in the proceeding discussion is there any reason for this policy or was it found by trial and error and is currently unjustified theoretically justified empirically instead this issue should be clearly elaborated in the text note that the addition of trainable additive and multiplicative elements is inserting the normalization back while it was claimed to be eliminated if i understand correctly the zeroinit method is hence not based on initialization or at least not only on initialization but on another form of normalization which is not more justified than its competitors in fact it is even more mysterious what should we need an additive bias before every element in the network page 5 what is sqrt12 scaling it should be defined or given a reference page 6 it is not stated on what data set figure 2 was generated in table 2 for cifar10 the comparison between xavier init and zeroinit shows only a small advantage for the latter for svhn such an experiment is completely missing and should be added o it raises the suspect the the good results obtained with zeroinit in this table are only due to the cutout and mixup used that is maybe such results could be obtained with cutoutmixup without zero init using plain xavier init experiments clarifying this point are also missing additional missing experiments it seems that zeroinit includes 3 ingredients according to the box in page 4 among which only one number 2 is roughly justified from the discussion step 1 of zeroing the last layer in each branch is not justified why are we zeroing the last layer and not the first for example step 3 is not even discussed in the text it appear without any argumentation for such steps empirical evidence should be brought and experiments doing this are missing specifically experiments of interest are o using zero init 
without its step 3 does it work the theory says it should o using only step 3 without steps 12 maybe only the normalization is doing the magic the paper is longer than 8 pages i have read the rebuttal regarding normalization i think that there are at least two reasonable meanings to the word normalziation in the wider sense is just means mechanism for reducing a global constant additive normalization and dividing by a global constant multiplicative normalization in this sense the constant parameters can be learnt in any way in the narrow sense the constants have to be statistics of the data i agree with the authors that their method is not normalization in sense 2 only in sense 1 note that keeping the normalization in sense 1 is not trivial why do we need these normalization operations at least for the multiplicative ones the network has the same expressive power without them i think the meaning of normalization should be clearly explained in the claim for no normalization regarding additional mathematical and empirical justifications required i think such justifications are missing in the current paper version and are not minor or easy to add i believe the work should be rejudged after resubmission of a version addressing the problemsdocsep this paper shows that with a clever initialization method resnets can be trained without using batchnorm and other normalization techniques the network can still reach stateoftheart performance the authors propose a new initialization method called zeroinit and use it to train very deep resnets up to 10000 layers they also show that the test performance of their method matches the performance of stateoftheart results on many tasks with the help of strong data augmentation this paper also indicates that the role of normalization in training deep resnets might not be as important as people thought in sum this is a very interesting paper that has novel contribution to the practical side of neural networks and new insights on the theoretical side pros 1 the analysis is not complicated and the algorithm for zeroinit is not complicated 2 many people believe normalization batchnorm layernorm etc not only improves the trainability of deep nns but also improves their generalization this paper provides empirical support that nns can still generalize well without using normalization it might be the case that the benefits from the data augmentation ie mixup cutout strictly contain those from normalization thus it is interesting to see if the network can still generalize well achieving 95 test accuracy on cifar10 without using strong dataaugmentation like mixup or cutout 3theoretical analysis of batchnorm and other normalization methods is quite challenging and often very technical the empirical results of this paper indicate that such analysis although very interesting might not be necessary for the theoretical understanding of resnets cons 1the analysis works for positively homogeneous activation functions ie relu but not for tanh or swish 2the method works for residual architectures but may not be applied to nonresidual networks ie vgg inception ### Summary:
the paper explores the effect of normalization and initialization in residual networks motivated by the need to avoid exploding and vanishing activations and gradients based on some theoretical analysis of stepsizes in sgd the authors propose a sensible but effective way of initializing a network that greatly increases training stability in a nutshell the method comes down to initializing the residual layers such that a single step of sgd results in a change in activations that is invariant to the depth of the network the experiments in the paper provide supporting evidence for the benefits the authors were able to train networks of up to 10000 layers deep the experiments have sufficient depth to support the claims overall the method seems to be a simple but effective technique for learning very deep residual networks while some aspects of the network have been used in earlier work such as initializing residual branches to output zeros these earlier methods lacked the rescaling aspect which seems crucial to the performance of this network the reviewers agree that the papers provides interesting ideas and significant theoretical and empirical contributions the main concerns by the reviewers were addressed by the author responses the ac finds that the remaining concerns raised by the reviewers are minor and insufficient for rejection of the paper
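The summary above describes initializing residual branches so that a single SGD step changes the network output by an amount that does not grow with depth. A minimal PyTorch-style sketch of that kind of scheme follows, assuming a Fixup-like rule (zero the last weighted layer of each residual branch and shrink the other branch weights by a depth-dependent factor); the exact exponent and module structure are illustrative assumptions, not the reviewed paper's verbatim procedure.

```python
# Hedged sketch of a depth-aware "zero init" for residual branches.
# Assumes each branch is an nn.Module whose weighted layers are Conv2d/Linear;
# the L**(-1/(2*m - 2)) scale is one Fixup-style choice, used here for illustration.
import torch.nn as nn

def zero_init_residual_branches(branches, layers_per_branch: int = 2) -> None:
    L = len(branches)                      # number of residual branches (network depth)
    m = layers_per_branch
    scale = L ** (-1.0 / (2 * m - 2)) if m > 1 else 1.0
    for branch in branches:
        weighted = [mod for mod in branch.modules()
                    if isinstance(mod, (nn.Conv2d, nn.Linear))]
        if not weighted:
            continue
        for mod in weighted[:-1]:
            mod.weight.data.mul_(scale)      # shrink earlier layers in the branch
        nn.init.zeros_(weighted[-1].weight)  # branch initially outputs zero
        if weighted[-1].bias is not None:
            nn.init.zeros_(weighted[-1].bias)
```

Under such a rule one gradient step through any branch perturbs the output by roughly the same amount regardless of L, which is the depth-invariance property the summary refers to.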
[ … ]  (input_ids: token-ID encoding of the Input text above; full list omitted)
[ 1, 1, … 1 ]  (attention_mask: all ones, same length as input_ids)
[ … ]  (labels: token-ID list identical to input_ids; full list omitted)
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper proposes a new approach for automatic termination of hyperparameter optimization based on bayesian optimization bo the idea is to construct highprobability confidence bound on the regret and then determine when to terminate the bo process empirical experiments are conducted on various realworld hpo and nas benchmarks to show the efficacy of the proposed approach overall i think the main idea of the work is interesting the writing of the paper is generally clear except section 3 which i think is not mathematically rigorous this section causes me a lot of confusion and i find its hard to understand all the maths behind the proposed approach i have several questions listed below 1 im confused by the definition of epsilonbo is epsilonbo a userdefined threshold or is there any formula showing the definition of epsilonbo i cant find the definition of epsilonbo in the paper also in proposition 1 it says that hatgammad is a candidate solution of mingamma in gamma hatfgamma such that hatfhatgammad hatfgammad leq epsilonbo this means the set of hatgammad depends on epsilonbo if this is the case then i expect the notation of hatgammad should be changed to reflect the dependent because for different values of epsilonbo there will be a different set of candidate solutions 2 in the first sentence right after the proof of proposition 1 it is stated that the optimization error epsilonbo is bounded by the simple regret hatrt why is this the case if epsilonbo is a userdefined threshold then it shouldnt have any relation with the simple regret why is it bounded by the simple regret i think the confusion here is also related to my confusion on the definition of epsilonbo 3 in the paragraph upper bound for hatrt at the end of page 3 there is a statement saying that with high probability hatfgamma is bounded by lower and upper confidence bounds defined as lcbtgamma mutgamma betat sigmatgamma and ucbtgamma mutgamma betat sigmatgamma this statement is not always true it is only true when the domain of the objective function is discrete when the domain is continuous it needs to have another term o1t2 in the upper and lower bound because of the discretization process see lemmas 55 56 57 in srinivas et al 2010 4 in eq 7 i do not understand why epsilonbo is smaller than hatrt can the authors explain in more detail 5 in the sentence right below eq 7 it states that hatfgammat mingamma in gt hatfgamma on the other hand in eq 6 it states that hatfgammat is the best value found by iteration t combining these two equations this means gammat is the value with the lowest value of hatfgamma in practice this is only true when the empirical loss function hatfgamma is noiseless however note that to apply the regret analysis in srinivas et al 2010 the objective function needs to be noisy 6 im also confused with proposition 2 is this a proposal from the paper that is to terminate bo when barrt sqrtvar hatf gammat or is this a mathematical proposition ie when the condition in eq 9 occurs bo is automatically terminated i cant find a proof of proposition 2 in the appendix also by checking the algorithm 1 in page 12 in the appendix i have a feeling that this proposition 2 is just a proposal from the paper is this the case 7 i found its a bit hard to read figure 3 more detailed explanation for figure 3 is needed in the revised version of the paper like i mentioned in the previous section i think the overall idea 
of the work is interesting but the maths behind the proposed approach is not rigorous and confused i find a lot of confusion reading section 3 which is the main section of the paper i have several questions to the authors listed in the previous sections hopefully the answers from the authors will be able to clear my doubt docsepthis paper proposes an automatic termination criterion for bayesian optimization bo by using the upper bound of the simple reget various experiments are conducted to demonstrate that with the utilization of the termination criterion computation and energy consumption can be reduced the major contribution of this paper lies in the two propositions proposition 1 discusses the relationship between statistical error and optimization error the authors claim that due to the irreducible statistical error it is appropriate to reduce the optimization error epsilonbo to the same magnitude wrt the statistical error since the statistical error epsilonst is unknown the authors adopt an existing crossvalidate method to estimate it then in proposition 2 the termination criterion is detailed once the bo regret is less than the standard deviation of statistical variance lack of literature review in general this paper aims to solve the earlyoptimal stopping of the bayesian optimization bo plenty of related works are even not mentioned in the paper eg freezethaw bo 1 multifidelity bo 2 hyperband bo 3 and bosbo 4 these works all discussed how to achieve efficiency and saving budget in bo a detailed discussion of the difference between the proposed method and all the previous works needs to be mentioned not to mention comparison experiments the stopping criterion is mainly decided by eq 7 specifically determined by the upper confidence bound and lower confidence bound however this bound is quite loose since there is one variable determining the magnitude of it betat through the paper the authors do not mention the role of betat and its impact on regret in fact if beta is set small enough the condition in proposition 2 can be easily satisfied right moreover the variance estimate of hatf lacks theoretical analysis i am expecting the authors to giving theoretical guarantee of the estimate to the real variance otherwise it wont be possible to determine whether the estimate is to the same magnitude of the real variance 1 swersky k snoek j and adams r p freezethaw bayesian optimization arxiv14063896 2014 2 kandasamy k dasarathy g oliva j b schneider j and poczos b gaussian process bandit optimization with multifidelity evaluations in proc nips 2016 3 falkner s klein a and hutter f bohb robust and efficient hyperparameter optimization at scale in proc icml 2018 4 dai z et al bayesian optimization meets bayesian optimal stopping in proc icml 2019 given the lack of literature review and comparison experiments plus the bound for epsilonbo lacks theoretical guarantee this paper needs a major revision docsepthis paper studies the problem of prespecifying the optimal termination criterion for bayesian optimization different from prior work that tracks the value of acquisition function this paper proposes an automatic termination criterion for bo in particular they construct a highprobability confidence bound on the regret and then the users can specify a desired tolerance that shows how accurate the final solution should be compared to the global optimal they estimate the threshold via a crossvalidation estimate of the generalization error empirically they design two evaluation metrics relative test 
error change ryc and relative time change rtc and compare to the comprehensive prior work the results demonstrate the effectiveness of their proposed approach the problem setting of early terminating the entire bo process is interesting and novel how to effectively prespecify a termination criterion is a practical challenge in the field of hpo the proposed mechanism of bounding the regret based on a userdefined tolerance makes intuitive sense it is also reasonable to notice that overfitting can occur in hpo here are a few detailed comments figure 1a is not very visually informative it is hard to understand that the orange line is the ucb and the green line is the lcb visually some typos proovedproved in page 3 letter latter in page 4 it is hard to understand intuitively why the suboptimal solution resulted from the empirical surrogate can be mitigated by early stopping it is suspicious to draw the conclusion that it can be mitigated by the proposed method could you explain more quality the submission is technically sound the claims in the contribution are wellsupported by theoretical analyses and empirically results it is a complete piece of work that outperforms prior work empirically clarity this paper is wellwritten and easy to follow the problem is also wellmotivated the experimental details are also very specific such that reproducing the results should be possible this paper proposes a novel early stopping criterion for bayesian optimization it is wellwritten and wellmotivated the proposed idea is simple and technically sound the claims are wellsupported by theoretical analyses and extensive experimental results ### Summary:
in this paper the stopping condition of bayesian optimization bo is discussed this problem is very important when bo is applied to the hyperparameter optimization hpo task all the reviewers agree that the proposed approach based on highprobability confidence bound on the regret is interesting and reasonable an important issue raised by a reviewer is that many existing bo works discussed how to achieve efficiency and saving budget in bo although they did not explicitly mention the stopping condition due to the lack of discussion regarding the relationship with these highly related studies we have to conclude that the paper cannot be accepted in its current form
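A minimal sketch of the termination rule these reviews discuss, assuming a Gaussian-process surrogate with posterior mean mu and standard deviation sigma over a discrete candidate set: the simple regret is bounded by the gap between the best observed upper confidence bound and the smallest lower confidence bound, and the search stops once that gap falls below an estimate of the irreducible statistical error. The function names, the fixed beta, and the use of the spread of cross-validation fold losses as that estimate are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def regret_upper_bound(mu, sigma, observed_idx, beta):
    """High-probability upper bound on the simple regret from GP confidence bounds."""
    ucb = mu + beta * sigma                          # upper confidence bound per candidate
    lcb = mu - beta * sigma                          # lower confidence bound per candidate
    best_observed_ucb = np.min(ucb[observed_idx])    # bound on the best value found so far
    best_possible_lcb = np.min(lcb)                  # bound on the global optimum
    return best_observed_ucb - best_possible_lcb

def should_terminate(mu, sigma, observed_idx, beta, cv_fold_losses):
    """Stop BO once the regret bound drops below the estimated statistical noise."""
    noise_level = np.std(cv_fold_losses)   # assumed stand-in for the irreducible error
    return regret_upper_bound(mu, sigma, observed_idx, beta) <= noise_level
```

Written this way the rule is monotone in beta: a larger beta widens the confidence bounds and delays termination, which is exactly the sensitivity one of the reviews raises.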
input_ids: [30003, 310, 1677, 2278, …, 830]  (tokenized review text; full list omitted)
attention_mask: [1, 1, 1, …, 1]  (all ones; full list omitted)
labels: [30003, 310, 1677, 2278, …, 830]  (full token list omitted)
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the manuscript proposed an unsupervised image registration algorithm based on the optimal transfer scheme solving an unbalanced fused gromovwasserstein optimization problem the algorithm was evaluated on an fmri data registration task to test whether the correspondence between fmri signals can be improved after the registration effective registration from one individual image to a common space as well as from one individual to another individual is highly needed in medical image analysis although various registration algorithms have been developed the proposed fused gromovwasserstein optimal transfer algorithm with unbalanced matching is a novel and plausible way to solve this task in an unsupervised approach the capability of considering both the image itself ie the mesh and features defined on the image eg the activation map used in this work is much needed especially for the multimodal image fusion the methodology of this work is clearly written and the overall quality of this manuscript is good the major issue of this work is that its evaluation scheme needs to be improved see questions below na this works only developed an image registration algorithm docsepthe authors propose an optimal transport based registration method for meshes with multivariate features a principle usecase of which is neuroimaging cortical mesh alignment they construct a gromovwasserstein loss which is the usual transport distance term to penalize large deformations wrt the mesh geodesic which they use alongside a wasserstein loss which is the transport matching term overall the transport method is defined for an unbalanced transport problem where the domains need not match and where restrictions may be placed on transport tofrom specific regions experiments are presented with both synthetic and real data using subsets of the individual brain charting dataset composed of a variety of task fmri and the standard anatomical scans comparisons are made with multisurface matching msm robinson 2014 as well as a template analysis barycenters strengths this paper is well written and enjoyable to read the treatment of the transport problem is well done and the implementation of a standard solution method scaling to this particular problem seems sound weaknesses the method requires a dense interaction matrix of v2 size where v is the number of vertices on the mesh limiting this method to something on the order of 10k vertices though to the authors credit considerations were made here the baseline method is untuned comment the technical innovation is limited to a specific application this is not disqualifying but it is limiting in significance to the neurips community relative to say a conference or journal from the application domain eg ieee tmi none docsepa common problem in the use of brain imaging data is that individual brains differ in both geometry and function requiring some form of alignment before comparison the authors propose a novel method for aligning different brains fugw which uses a combination of distance losses to respect functional geometric and even fundamental region differences between different brain images strengths the problem statement and motivations are clear the problem of accurate brain alignment is interesting both for current human neuroscience research and potential future crossspecies comparisons the paper is well organized and clearly written toy examples are given to 
demonstrate the differences of their method the method itself seems technically sound with appropriate references to past work and a clear intuition of what each term does the numerical experiments are welldesigned and explained and they demonstrate a marked advantage of the proposed method in both quantitative and qualitative terms weaknesses i would have liked to see the impact of these brain alignments on downstream experimentstasks i would also be interested in computational comparisons how much more difficult is it to do this vs msm in experiment 2 it was necessary to reduce the scale of the individual brain meshes to run at all though admittedly the other methods couldnt even be applied the technique may not scale well computationally though that could more be an issue of the size of the images themselves ### Summary:
this paper uses optimal transport for aligning cortical surfaces based on the similarity of their functional signatures under different stimulations the paper is well written and the experimental setup is sound the authors added experiments and clarifications to address reviewers comments and concerns the reviewers provided a consensus accept rating for this paper
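The alignment loss these reviews describe combines a feature-matching (Wasserstein) term, a geometry-preserving (Gromov-Wasserstein) term on mesh geodesics, and a relaxation of the marginal constraints so that the two domains need not carry the same mass. One common way to write such an objective over a soft vertex-to-vertex coupling P, with symbols chosen here for illustration rather than taken from the paper, is:

```latex
\min_{P \ge 0}\;
(1-\alpha)\sum_{i,j} P_{ij}\,\lVert F^{s}_{i}-F^{t}_{j}\rVert^{2}
\;+\;\alpha\sum_{i,j,k,l} P_{ij}P_{kl}\,\bigl(D^{s}_{ik}-D^{t}_{jl}\bigr)^{2}
\;+\;\rho\,\Bigl[\mathrm{KL}\bigl(P\mathbf{1}\,\big\Vert\,w^{s}\bigr)
              +\mathrm{KL}\bigl(P^{\top}\mathbf{1}\,\big\Vert\,w^{t}\bigr)\Bigr]
```

Here F^s, F^t are vertex features (e.g., fMRI activation patterns), D^s, D^t are geodesic distance matrices on the two meshes, w^s, w^t are vertex weights, alpha trades off the two matching terms, and the KL penalties implement the unbalanced part by letting mass be created or destroyed; an entropic regularizer on P is typically added so the problem can be solved with scaling iterations. The quadratic Gromov-Wasserstein term is what produces the dense v-squared interaction noted as a scalability limit in the reviews.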
input_ids: [30003, 310, 1677, 2278, …, 50276]  (tokenized review text; full list omitted)
attention_mask: [1, 1, 1, …, 1]  (all ones; full list omitted)
labels: [30003, 310, 1677, 2278, …, 50276]  (full token list omitted)
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: incorporating hamiltonian dynamics based inductive bias into deep neural networks has gained significant attention in the machine learning community over the last few years however in many real physical systems the underlying odes can exhibit stiffness ie numerical instability over some time intervals or for certain values of initial conditions andor parameter choice as a result the existing realizations of hamiltonianpreserving neural networks underperform in these scenarios this paper proposes a solution to overcome this issue in particular it introduces an easy to compute stiffnessaware index to split the training data points into stiff and nonstiff groups which are then treated with different integration schemes with different values for integration time interval and time steps when demonstrated on relevant physical systems the proposed approach shows improved accuracy in prediction and energy conservation the paper is wellwritten and it conveys the key ideas in a very precise manner the authors have done an excellent job in highlighting the need for a stiffnessaware approach figure 1 and the related discussions in the introduction section provide a compelling motivation for the problem the idea is straightforward but quite effective as one can see from the experimental results however unless some aspects of the paper are further improved or better explained it remains challenging to evaluate the real contribution and impact of this work the paper mentions that the mass matrix m is diagonal and subsequently it uses a set of trainable parameters to learn the individual elements of this diagonal matrix would you please confirm if this is indeed the case or the writing is conveying a wrong picture for a large class of hamiltonian systems the massinertia matrix is not diagonal and its entries depend on the generalized coordinates for example please consider a klink pendulum with joint angles as the generalized coordinates therefore if m is indeed assumed to be a diagonal matrix the scope of this work is very restricted unless the authors can show with additional experiments that the proposed approach holds true even when the mass matrix is nondiagonal and positiondependent basri et al the convergence rate of neural networks for learned functions of different frequencies neurips 2019 have demonstrated that the number of epochs to learn a function has a squared relationship with its frequency as the stiffness of an ode is related to the presence of very highfrequency components in its solution it may be possible that srnn or hnn can achieve comparable accuracy if they are trained over a sufficiently long period but the current set of experiments do not provide a precise answer to this equation therefore i would strongly encourage the authors to run an additional experiment that trains srnn and hnn with a very high number of epochs finzi et al simplifying hamiltonian and lagrangian neural networks via explicit constraints neurips 2020 have shown that the l1 loss functions exhibit better performance while inferring dynamics from data it would be helpful to see if the same holds true for this problem as well therefore the authors should consider carrying out an additional ablation study that shows how the performance varies when the loss function uses the l1 norm recent work by kim et al stiff neural ordinary differential equations arxiv210315341 has extended the scope of neural 
odes to stiff differential equations i would encourage the authors to consider this approach if possible as an additional baseline the related work section has missed some relevant prior work that enforces hamiltonian dynamics while using a neural network to infer dynamics from data please refer to the following survey papers and the references therein for further details about the relevant prior work integrating physicsbased modeling with machine learning a survey willard et al arxiv200304919 an overview on recent machine learning techniques for port hamiltonian systems cherifi physica d 2020 and benchmarking energyconserving neural networks for learning dynamics from data zhong et al l4dc 2021 some additional minor comments the second paragraph of the introduction mentions the particles deviate from the reference orbits rapidly after collision as two particles cannot collide in this setting the authors should rephrase the sentence with after a close encounter the authors should consider introducing gamma ie the hyperparameter that denotes the stiffness ratio threshold as a percentile number it would make the point clearer also the largerstiffer should be either a largerstiffer or the largeststiffest this paper has proposed a straightforward but effective solution that can infer hamiltonian dynamics even when the governing odes exhibit numerical stiffness the problem is very wellmotivated and the paper explains the core ideas in a very precise way however i have mentioned in the main review some aspects especially the assumption that m is a diagonal matrix with entries that are positionindependent should be properly addressed or explained to increase the concreteness and overall clarity of the paper post rebuttal response i would like to thank the authors for addressing the prior concerns i have updated my score docsepthis paper presents a new method stiffnessaware neural network sann for learning hamiltonian systems from data the authors define a stiffnessaware index sai for classifying the training data into stiff and nonstiff samples based on the classification result the step size for the numerical solver is adjusted and the number of samples for training is balanced the effectiveness of sann is demonstrated using chaotic hamiltonian systems ie a threebody problem and billiard model strong points a stiffnessaware approach for learning hamiltonian systems is novel the experiments show that sann can accurately simulate complex hamiltonian systems than baseline methods hnn and srnn this paper is wellwritten weak points there are some hyperparameters to be determined manually the compared methods are the bare minimum there are several concerns about the experimental setting comments 1 the task of learning physical dynamics from data has been of great interest recently a key idea of incorporating stiffness into the learning scheme is novel and exciting the proposed method is simple but effective however there are the hyperparameters gamma s to be manually determined especially the ratio of the stiff portion gamma could be critical for performance could the authors explain how to estimate this hyperparameter for various physical systems including the mbody problem when m is large 2 the proposed method classifies the intervals into binary labels that is stiff or nonstiff it might be helpful to model the continuous stiffratio for each interval that is learnable 3 the authors assume the separable hamiltonian hpqtpvq can the proposed learning framework apply to the inseparable generalized model 
4 the hnn greydanus et al 2019 does not consider the separable assumption in the experiments did the authors use this assumption in the hnn learning also the input of hnn is the partial derivatives unlike sann and hrnn how did the authors give the input to hnn 5 why did the authors use the leapfrog solver isnt the simple euler method appropriate this paper addresses the interesting problem and is wellmotivated also the proposed approach is novel and the experimental results are insightful although i tend to accept this paper i have some concerns which i detail in the main review docsepthis paper proposes to improve the learning of a hamiltonian system by characterizing the stiffness of the time series data a stiffnessaware index is first used to classify the time interval into stiff and nonstiff then during the training of the hamiltonian network the stiff part is integrated using a smaller timestep and also sampled more frequently in the training data separating the stiff and nonstiff parts of the data when training a hamiltonian network is a novel idea and the results show certain advantages of using this method and experiments are included to discuss the influence of resampling integration partitions and activation functions however only two examples are shown in the experiments it would be more convincing if experiments of more hamiltonian systems can be conducted some additional questions when predicting future states does the model use s fixed time step or still using a smaller timestep for stiff parts if so how is the stiffness calculated are there significant advantages of using an additional trainable term ptmp instead of training a single network that represents ptmp qw there are certain contributions of this paper on proposing characterizing the stiffness of the time series data which shows its advantage of improving the performance of the hamiltonian network docsepthis study proposes the stiffaware index sai for an ordinary differential equation and the training strategy that uses samples with large sais more frequently a neural network has the implicit bias to tend to learn a smooth function only a limited portion of data obtained from a stiff system exhibits a rapid change in other words the training data is imbalanced between a gradual change and a rapid change hence without sai it is difficult for a neural network to learn a stiff dynamics the contribution of sai is confirmed using a threebody problem positives it is an insightful suggestion that the stiffness is a bottleneck to learning of a physical system it is surprising that a simple oversampling is enough negatives section 5 demonstrates that sai is a good approximation to stiffness index si but this might not hold for different coordinate systems si assumes that the origin of the coordinate system is an equilibrium point in other words the bias from the equilibrium point is already subtracted from the position however sai does not or cannot in practice the norm of the state and thereby sai depend on the coordinate system it is preferable to valid the generality of sai the sampling strategy is heuristic as shown in table 1 the results are sensitive to the hyperparameter tuning a guidance justified theoretically is preferable minor comments it might be an insightful suggestion that the neural network tends to learn a smooth function and this implicit bias is a bottleneck to learning of a stiff system however this suggestion is not validated by experiments and it is unclear whether the proposal resolves this problem an 
additional experiment or analysis is not mandatory but may improve the contribution of the present study figure 1 is a bit confusing i prefer that the scales of the axes are fixed within each example after discussion all my concerns were addressed by the additional experiments and explanations i update the score from 5 to 6 this study is based on an insightful suggestion and the proposed method is simple but effective however the strategy is heuristic and the generality is unclear ### Summary:
this paper introduces the stiffnessaware neural network sann for improving numerical stability in hamiltonian neural networks to this end the authors introduce the stiffnessaware index sai to classify time intervals into stiff and nonstiff portions and propose to adapt the integration scheme accordingly the paper initially received three weak accept and one weak reject recommendations the main limitations pointed out by reviewers relate to missing references from the literature assumptions behind the proposed approach eg structure of the mass matrix separable hamiltonian and clarifications on experiments including additional baselines and hyperparameter settings the rebuttal did a good job in answering reviewers concerns rittu increased his rating to a clear accept and rmyxe increased his rating to weak accept eventually there is a consensus among reviewers to accept the paper the acs own readings confirmed the reviewers recommendation the method is straightforward yet effective and the paper is well written the effectiveness of the proposed approach is shown in different contexts since several complex systems exhibit chaotic characteristics the paper brings a meaningful contribution to the community
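To make the integration choices debated in the reviews above concrete, the sketch below (an illustrative assumption, not the authors' code) contrasts an explicit Euler update with a leapfrog update for a separable Hamiltonian H(q, p) = p^2 / (2m) + V(q), and applies a smaller step on states flagged as stiff in the spirit of SAI; the stiffness proxy, threshold, and refinement factor are invented for illustration and are not values from the paper.

```python
def euler_step(q, p, dV, m, dt):
    # Explicit Euler update: simple but not symplectic, so energy drifts on stiff intervals.
    return q + dt * p / m, p - dt * dV(q)

def leapfrog_step(q, p, dV, m, dt):
    # Leapfrog (velocity Verlet) update: symplectic, better long-horizon energy behaviour.
    p_half = p - 0.5 * dt * dV(q)
    q_new = q + dt * p_half / m
    p_new = p_half - 0.5 * dt * dV(q_new)
    return q_new, p_new

def integrate(q0, p0, dV, m, t_end, dt, stiffness_index, threshold=10.0, refine=4):
    # Stiffness-aware stepping: states flagged as stiff get a smaller step, mirroring the
    # stiff / non-stiff split described in the reviews; threshold and refine are
    # illustrative hyperparameters, not values reported in the paper.
    q, p, t, traj = q0, p0, 0.0, [(q0, p0)]
    while t < t_end:
        h = dt / refine if stiffness_index(q, p) > threshold else dt
        q, p = leapfrog_step(q, p, dV, m, h)
        t += h
        traj.append((q, p))
    return traj

# Toy usage: a stiff 1-D oscillator H(q, p) = p**2 / (2 * m) + 0.5 * k * q**2 with large k;
# the stiffness proxy below is only a crude stand-in for the paper's SAI.
k, m = 500.0, 1.0
traj = integrate(1.0, 0.0, dV=lambda q: k * q, m=m, t_end=1.0, dt=0.01,
                 stiffness_index=lambda q, p: abs(k * q) + abs(p))
```

As written, leapfrog costs one extra gradient evaluation per step compared with explicit Euler but preserves the symplectic structure, which is the usual reason it is preferred for Hamiltonian systems and speaks to the leapfrog-versus-Euler question raised in the first review.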
[ 2929, 310, 973, 15720, 285, 352, 11785, 656, 253, 2234, 5697, 275, 247, 1077, 10799, 5133, 253, 4477, 452, 2218, 271, 7126, 2628, 275, 27321, 253, 878, 323, 247, 26370, 13823, 2746, 4677, 337, 285, 253, 2905, 11985, 275, 253, 10199, 2593, 2085, 247, 18511, 16038, 323, 253, 1895, 50276, 783, 2934, 310, 15246, 533, 3240, 3576, 347, 581, 476, 923, 432, 253, 5661, 1543, 2299, 5734, 690, 7794, 273, 253, 2929, 403, 2007, 5520, 390, 1805, 5544, 352, 4558, 11132, 281, 7472, 253, 1524, 7680, 285, 3486, 273, 436, 789, 50276, 783, 2929, 25957, 326, 253, 2280, 4315, 278, 310, 16421, 285, 9674, 352, 4648, 247, 873, 273, 6194, 494, 3602, 281, 3037, 253, 2060, 3603, 273, 436, 16421, 4315, 651, 368, 4496, 6583, 604, 436, 310, 6296, 253, 1083, 390, 253, 4028, 310, 43049, 247, 3430, 5406, 323, 247, 1781, 966, 273, 10546, 7839, 757, 2718, 253, 2280, 249, 797, 571, 4315, 310, 417, 16421, 285, 697, 12028, 3469, 327, 253, 14923, 11627, 323, 1650, 4496, 1908, 247, 465, 4492, 32752, 15508, 342, 6036, 14636, 347, 253, 14923, 11627, 3103, 604, 278, 310, 6296, 8025, 281, 320, 247, 16421, 4315, 253, 7990, 273, 436, 789, 310, 1077, 11096, 5734, 253, 4477, 476, 921, 342, 3081, 4679, 326, 253, 4081, 2746, 6556, 2032, 1014, 672, 253, 2280, 4315, 310, 27370, 74, 21805, 285, 1899, 6820, 50276, 10352, 363, 1162, 355, 253, 14940, 2281, 273, 11454, 6928, 323, 6311, 3470, 273, 1027, 11383, 5723, 2824, 6247, 452, 5183, 326, 253, 1180, 273, 44540, 281, 3037, 247, 1159, 556, 247, 30044, 2954, 342, 697, 4294, 347, 253, 26370, 273, 271, 258, 615, 310, 2905, 281, 253, 3361, 273, 1077, 1029, 18163, 4295, 275, 697, 2900, 352, 778, 320, 1896, 326, 49975, 9866, 390, 288, 9866, 476, 5115, 10870, 7200, 604, 597, 403, 10166, 689, 247, 10481, 1048, 2180, 533, 253, 1655, 873, 273, 4679, 513, 417, 2085, 247, 10799, 3662, 281, 436, 5150, 3103, 891, 651, 7052, 11907, 253, 4477, 281, 1408, 271, 3081, 3368, 326, 18784, 49975, 9866, 285, 288, 9866, 342, 247, 1077, 1029, 1180, 273, 44540, 50276, 9750, 9877, 1162, 355, 8077, 5411, 10546, 7839, 757, 285, 16653, 23623, 11454, 6928, 3066, 6843, 10806, 5723, 2824, 9169, 452, 2011, 326, 253, 298, 18, 2957, 3470, 10738, 1805, 3045, 1223, 9441, 804, 8062, 432, 941, 352, 651, 320, 9371, 281, 923, 604, 253, 1072, 6556, 2032, 323, 436, 1895, 347, 973, 3103, 253, 4477, 943, 1908, 8785, 562, 271, 3081, 28913, 1263, 326, 2722, 849, 253, 3045, 16149, 672, 253, 2957, 1159, 4648, 253, 298, 18, 5222, 50276, 45019, 789, 407, 465, 303, 1162, 355, 13827, 11454, 9826, 8967, 7424, 549, 32693, 19, 12172, 1010, 28459, 556, 6508, 253, 7990, 273, 11454, 258, 3229, 281, 13827, 8967, 7424, 891, 651, 11907, 253, 4477, 281, 1908, 436, 2746, 604, 1896, 347, 271, 3081, 8245, 50276, 783, 2905, 789, 2593, 556, 9829, 690, 4623, 2720, 789, 326, 546, 36217, 10546, 7839, 757, 8062, 1223, 970, 247, 11454, 2990, 281, 9441, 8062, 432, 941, 4496, 3730, 281, 253, 1563, 6630, 9380, 285, 253, 10414, 15308, 323, 2007, 4278, 670, 253, 4623, 2720, 789, 24399, 12057, 3169, 14053, 342, 5145, 4715, 247, 6630, 588, 472, 1162, 355, 549, 32693, 1518, 1229, 2537, 746, 271, 18389, 327, 3332, 5145, 4715, 5609, 323, 2245, 10546, 7839, 757, 2718, 22765, 18279, 2150, 3737, 277, 9169, 285, 22791, 272, 2341, 36157, 272, 11454, 6928, 323, 4715, 8062, 432, 941, 1182, 73, 543, 1162, 355, 298, 21, 12352, 43425, 50276, 8826, 3081, 5884, 5701, 50275, 783, 1273, 12494, 273, 253, 10199, 25957, 253, 6353, 1474, 4513, 432, 253, 3806, 24679, 9086, 846, 15708, 347, 767, 6353, 2550, 3007, 504, 275, 436, 4758, 253, 4477, 943, 294, 40712, 253, 6197, 342, 846, 247, 2810, 
13329, 50275, 783, 4477, 943, 1908, 16984, 17356, 26332, 253, 4373, 19484, 326, 12853, 253, 26370, 4313, 7887, 347, 247, 36384, 1180, 352, 651, 1056, 253, 1127, 30909, 50275, 12563, 253, 4067, 296, 36271, 943, 320, 2057, 247, 4067, 296, 36271, 390, 253, 6253, 296, 1648, 383, 436, 2929, 556, 4081, 247, 15246, 533, 3576, 2900, 326, 476, 9441, 10546, 7839, 757, 8062, 1014, 672, 253, 13200, 258, 3229, 10738, 10704, 26370, 253, 1895, 310, 1077, 973, 24013, 8550, 285, 253, 2929, 11424, 253, 5161, 5697, 275, 247, 1077, 10799, 1039, 2299, 891, 452, 5393, 275, 253, 2022, 2278, 690, 7794, 3340, 253, 9376, 326, 278, 310, 247, 16421, 4315, 342, 12028, 326, 403, 1899, 17777, 943, 320, 6283, 9713, 390, 5544, 281, 2572, 253, 345, 719, 1866, 405, 285, 4583, 19843, 273, 253, 2929, 50275, 5996, 30080, 22559, 2380, 50276, 74, 651, 751, 281, 5717, 253, 4477, 323, 15974, 253, 2720, 7350, 891, 452, 9300, 619, 4868, 5474, 33032, 2520, 2929, 10262, 247, 747, 1332, 26370, 13823, 11454, 2990, 256, 1136, 323, 4715, 10546, 7839, 757, 2718, 432, 941, 253, 4477, 4853, 247, 26370, 13823, 3605, 618, 74, 323, 49653, 253, 3733, 941, 715, 13827, 285, 1327, 296, 1648, 3530, 1754, 327, 253, 9162, 906, 253, 3213, 1979, 323, 253, 10704, 47037, 310, 10904, 285, 253, 1180, 273, 3530, 323, 3733, 310, 16645, 253, 12510, 273, 256, 1136, 310, 5183, 970, 29784, 10546, 7839, 757, 2718, 26332, 247, 1264, 2915, 1895, 285, 270, 3370, 472, 1566, 2266, 2792, 50276, 66, 26370, 13823, 2746, 323, 4715, 10546, 7839, 757, 2718, 310, 4460, 50276, 783, 4679, 921, 326, 256, 1136, 476, 13613, 26065, 2570, 10546, 7839, 757, 2718, 685, 8245, 3082, 288, 9866, 285, 49975, 9866, 50276, 2520, 2929, 310, 973, 15720, 50276, 20881, 2792, 50276, 9088, 403, 690, 4373, 22041, 281, 320, 3413, 13542, 50276, 783, 2429, 3082, 403, 253, 8050, 5927, 50276, 9088, 403, 2067, 7350, 670, 253, 5661, 4758, 50276, 26122, 337, 253, 4836, 273, 4715, 3520, 8062, 432, 941, 556, 644, 273, 1270, 1600, 4102, 247, 2234, 2934, 273, 24049, 26370, 715, 253, 4715, 6974, 310, 4460, 285, 12302, 253, 4081, 1332, 310, 2969, 533, 3576, 2299, 627, 403, 253, 4373, 22041, 17356, 256, 281, 320, 13542, 3413, 3340, 253, 4313, 273, 253, 13827, 5110, 17356, 812, 320, 4619, 323, 3045, 812, 253, 4477, 5513, 849, 281, 6642, 436, 4373, 19484, 323, 2710, 3520, 2718, 1690, 253, 278, 2915, 1895, 672, 278, 310, 1781, 50276, 19, 253, 4081, 1332, 966, 7790, 253, 11508, 715, 8985, 13301, 326, 310, 13827, 390, 1327, 296, 1648, 352, 1537, 320, 9371, 281, 1566, 253, 5415, 331, 338, 925, 28965, 323, 1016, 7726, 326, 310, 3037, 494, 50276, 20, 253, 4477, 5467, 253, 39690, 10546, 7839, 757, 288, 33426, 17394, 87, 82, 476, 253, 4081, 4715, 7792, 4647, 281, 253, 275, 16806, 494, 14923, 1566, 50276, 21, 253, 288, 9866, 14370, 21329, 316, 1162, 355, 6247, 1057, 417, 1908, 253, 39690, 9376, 275, 253, 4679, 858, 253, 4477, 897, 436, 9376, 275, 253, 288, 9866, 4715, 671, 253, 3280, 273, 288, 9866, 310, 253, 7898, 13335, 12401, 256, 1136, 285, 20589, 9866, 849, 858, 253, 4477, 1918, 253, 3280, 281, 288, 9866, 50276, 22, 2139, 858, 253, 4477, 897, 253, 26416, 71, 6375, 47037, 310, 2649, 253, 2969, 299, 14398, 1332, 4569, 436, 2929, 12453, 253, 4722, 1895, 285, 310, 973, 24013, 8550, 671, 253, 4081, 2746, 310, 4460, 285, 253, 5661, 1543, 403, 47860, 3738, 891, 5257, 281, 2997, 436, 2929, 891, 452, 690, 7350, 534, 891, 2508, 275, 253, 2022, 2278, 5474, 33032, 2520, 2929, 29328, 281, 3157, 253, 4715, 273, 247, 10546, 7839, 757, 985, 407, 39330, 253, 26370, 273, 253, 673, 2962, 941, 50276, 66, 26370, 13823, 3605, 310, 806, 908, 
281, 30215, 253, 673, 7726, 715, 13827, 285, 1327, 296, 1648, 840, 1309, 253, 3733, 273, 253, 10546, 7839, 757, 2990, 253, 13827, 629, 310, 8527, 970, 247, 4577, 4522, 383, 554, 285, 671, 19958, 625, 7208, 275, 253, 3733, 941, 23694, 253, 13827, 285, 1327, 296, 1648, 4243, 273, 253, 941, 672, 3733, 247, 10546, 7839, 757, 2990, 310, 247, 4460, 2934, 285, 253, 1543, 921, 2176, 11361, 273, 970, 436, 1332, 285, 4679, 403, 2908, 281, 2319, 253, 4833, 273, 501, 312, 4906, 9554, 27959, 285, 5743, 3470, 2299, 760, 767, 6667, 403, 2011, 275, 253, 4679, 352, 651, 320, 625, 21414, 604, 4679, 273, 625, 10546, 7839, 757, 2718, 476, 320, 5196, 50276, 8826, 3081, 3533, 672, 21565, 2852, 3054, 1057, 253, 1566, 897, 256, 4229, 673, 3213, 390, 1335, 970, 247, 4577, 4522, 383, 554, 323, 13827, 4243, 604, 594, 849, 310, 253, 26370, 5118, 403, 627, 1534, 11361, 273, 970, 271, 3081, 6194, 494, 1307, 268, 12780, 3185, 273, 3733, 247, 2014, 2990, 326, 6125, 268, 12780, 50276, 82, 88, 50274, 9088, 403, 2176, 9021, 273, 436, 2929, 327, 36636, 39330, 253, 26370, 273, 253, 673, 2962, 941, 534, 2722, 697, 5750, 273, 11138, 253, 3045, 273, 253, 10546, 7839, 757, 2990, 5474, 33032, 2520, 1263, 29328, 253, 13827, 13823, 3605, 618, 74, 323, 271, 9826, 8967, 5150, 285, 253, 3733, 5700, 326, 4648, 3530, 342, 1781, 618, 261, 625, 7208, 247, 11454, 2990, 556, 253, 15424, 8492, 281, 5257, 281, 3037, 247, 6032, 1159, 760, 247, 3710, 5110, 273, 941, 2797, 432, 247, 13827, 985, 15646, 247, 5233, 1818, 275, 643, 3000, 253, 3733, 941, 310, 516, 30063, 875, 247, 26830, 1818, 285, 247, 5233, 1818, 7613, 1293, 618, 74, 352, 310, 2834, 323, 247, 11454, 2990, 281, 3037, 247, 13827, 8062, 253, 7680, 273, 618, 74, 310, 5783, 970, 247, 1264, 2915, 1895, 50275, 993, 23223, 50276, 262, 310, 271, 47860, 14876, 326, 253, 26370, 310, 247, 3673, 44856, 281, 4715, 273, 247, 3520, 985, 50276, 262, 310, 10084, 326, 247, 2969, 689, 48027, 310, 2217, 50275, 8265, 3993, 50276, 4674, 608, 14371, 326, 618, 74, 310, 247, 1175, 11193, 281, 26370, 3605, 4927, 533, 436, 1537, 417, 2186, 323, 1027, 13249, 2718, 4927, 19584, 326, 253, 6510, 273, 253, 13249, 985, 310, 271, 12902, 1127, 275, 643, 3000, 253, 8492, 432, 253, 12902, 1127, 310, 2168, 42426, 432, 253, 1899, 2299, 618, 74, 1057, 417, 390, 2550, 275, 3946, 253, 5222, 273, 253, 1375, 285, 7624, 618, 74, 3469, 327, 253, 13249, 985, 352, 310, 29224, 281, 3588, 253, 31376, 273, 618, 74, 50276, 783, 10491, 5700, 310, 47641, 347, 2011, 275, 2829, 337, 253, 1543, 403, 7996, 281, 253, 4373, 19484, 25184, 247, 12925, 17285, 28055, 310, 29224, 50275, 37585, 5701, 50276, 262, 1537, 320, 271, 47860, 14876, 326, 253, 11454, 2990, 14280, 281, 3037, 247, 6032, 1159, 285, 436, 15424, 8492, 310, 247, 3673, 44856, 281, 4715, 273, 247, 13827, 985, 2299, 436, 14876, 310, 417, 17618, 407, 4679, 285, 352, 310, 12744, 1880, 253, 10419, 501, 14503, 436, 1895, 271, 3081, 3368, 390, 1783, 310, 417, 17396, 533, 778, 3157, 253, 7680, 273, 253, 1246, 1263, 50276, 13206, 337, 310, 247, 2372, 21643, 891, 4510, 326, 253, 11498, 273, 253, 24039, 403, 4229, 1561, 1016, 1650, 50275, 6438, 5955, 50276, 455, 619, 7350, 497, 9713, 407, 253, 3081, 4679, 285, 22909, 891, 5731, 253, 4868, 432, 608, 281, 721, 436, 1263, 310, 1754, 327, 271, 47860, 14876, 285, 253, 4081, 1332, 310, 2969, 533, 3576, 2299, 253, 5700, 310, 47641, 285, 253, 31376, 310, 12744, 50276, 187, 187, 4118, 18435, 27, 2520, 2929, 23970, 253, 26370, 13823, 11454, 2990, 256, 1136, 323, 11138, 10704, 7882, 275, 10546, 7839, 757, 11454, 6928, 281, 436, 990, 253, 4477, 9569, 
253, 26370, 13823, 3605, 618, 74, 281, 30215, 673, 11508, 715, 13827, 285, 1327, 296, 1648, 11821, 285, 12661, 281, 5223, 253, 9554, 6974, 15672, 50276, 783, 2929, 8523, 2959, 1264, 5075, 2997, 285, 581, 5075, 12009, 12645, 253, 2022, 7364, 8042, 562, 407, 30628, 14588, 281, 5816, 10414, 432, 253, 6239, 13260, 3212, 253, 4081, 2746, 24088, 2605, 273, 253, 2280, 4315, 39690, 10546, 7839, 757, 285, 8254, 6787, 327, 4679, 1690, 3081, 1666, 25379, 285, 4373, 19484, 7533, 50276, 783, 30080, 22559, 858, 247, 1175, 2628, 275, 22291, 30628, 7350, 391, 770, 86, 2559, 521, 13716, 281, 247, 2590, 2997, 285, 391, 2577, 7096, 2559, 521, 13716, 281, 5075, 2997, 6524, 627, 310, 247, 13969, 2190, 30628, 281, 2997, 253, 2929, 50275, 783, 913, 84, 1211, 28799, 5783, 253, 30628, 17401, 253, 1332, 310, 15246, 2568, 3576, 285, 253, 2929, 310, 973, 3542, 253, 12510, 273, 253, 4081, 2746, 310, 2011, 275, 1027, 22349, 1580, 2067, 2570, 2718, 10738, 29784, 5319, 253, 2929, 10316, 247, 14282, 7680, 281, 253, 3114 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 2929, 310, 973, 15720, 285, 352, 11785, 656, 253, 2234, 5697, 275, 247, 1077, 10799, 5133, 253, 4477, 452, 2218, 271, 7126, 2628, 275, 27321, 253, 878, 323, 247, 26370, 13823, 2746, 4677, 337, 285, 253, 2905, 11985, 275, 253, 10199, 2593, 2085, 247, 18511, 16038, 323, 253, 1895, 50276, 783, 2934, 310, 15246, 533, 3240, 3576, 347, 581, 476, 923, 432, 253, 5661, 1543, 2299, 5734, 690, 7794, 273, 253, 2929, 403, 2007, 5520, 390, 1805, 5544, 352, 4558, 11132, 281, 7472, 253, 1524, 7680, 285, 3486, 273, 436, 789, 50276, 783, 2929, 25957, 326, 253, 2280, 4315, 278, 310, 16421, 285, 9674, 352, 4648, 247, 873, 273, 6194, 494, 3602, 281, 3037, 253, 2060, 3603, 273, 436, 16421, 4315, 651, 368, 4496, 6583, 604, 436, 310, 6296, 253, 1083, 390, 253, 4028, 310, 43049, 247, 3430, 5406, 323, 247, 1781, 966, 273, 10546, 7839, 757, 2718, 253, 2280, 249, 797, 571, 4315, 310, 417, 16421, 285, 697, 12028, 3469, 327, 253, 14923, 11627, 323, 1650, 4496, 1908, 247, 465, 4492, 32752, 15508, 342, 6036, 14636, 347, 253, 14923, 11627, 3103, 604, 278, 310, 6296, 8025, 281, 320, 247, 16421, 4315, 253, 7990, 273, 436, 789, 310, 1077, 11096, 5734, 253, 4477, 476, 921, 342, 3081, 4679, 326, 253, 4081, 2746, 6556, 2032, 1014, 672, 253, 2280, 4315, 310, 27370, 74, 21805, 285, 1899, 6820, 50276, 10352, 363, 1162, 355, 253, 14940, 2281, 273, 11454, 6928, 323, 6311, 3470, 273, 1027, 11383, 5723, 2824, 6247, 452, 5183, 326, 253, 1180, 273, 44540, 281, 3037, 247, 1159, 556, 247, 30044, 2954, 342, 697, 4294, 347, 253, 26370, 273, 271, 258, 615, 310, 2905, 281, 253, 3361, 273, 1077, 1029, 18163, 4295, 275, 697, 2900, 352, 778, 320, 1896, 326, 49975, 9866, 390, 288, 9866, 476, 5115, 10870, 7200, 604, 597, 403, 10166, 689, 247, 10481, 1048, 2180, 533, 253, 1655, 873, 273, 4679, 513, 417, 2085, 247, 10799, 3662, 281, 436, 5150, 3103, 891, 651, 7052, 11907, 253, 4477, 281, 1408, 271, 3081, 3368, 326, 18784, 49975, 9866, 285, 288, 9866, 342, 247, 1077, 1029, 1180, 273, 44540, 50276, 9750, 9877, 1162, 355, 8077, 5411, 10546, 7839, 757, 285, 16653, 23623, 11454, 6928, 3066, 6843, 10806, 5723, 2824, 9169, 452, 2011, 326, 253, 298, 18, 2957, 3470, 10738, 1805, 3045, 1223, 9441, 804, 8062, 432, 941, 352, 651, 320, 9371, 281, 923, 604, 253, 1072, 6556, 2032, 323, 436, 1895, 347, 973, 3103, 253, 4477, 943, 1908, 8785, 562, 271, 3081, 28913, 1263, 326, 2722, 849, 253, 3045, 16149, 672, 253, 2957, 1159, 4648, 253, 298, 18, 5222, 50276, 45019, 789, 407, 465, 303, 1162, 355, 13827, 11454, 9826, 8967, 7424, 549, 32693, 19, 12172, 1010, 28459, 556, 6508, 253, 7990, 273, 11454, 258, 3229, 281, 13827, 8967, 7424, 891, 651, 11907, 253, 4477, 281, 1908, 436, 2746, 604, 1896, 347, 271, 3081, 8245, 50276, 783, 2905, 789, 2593, 556, 9829, 690, 4623, 2720, 789, 326, 546, 36217, 10546, 7839, 757, 8062, 1223, 970, 247, 11454, 2990, 281, 9441, 8062, 432, 941, 4496, 3730, 281, 253, 1563, 6630, 9380, 285, 253, 10414, 15308, 323, 2007, 4278, 670, 253, 4623, 2720, 789, 24399, 12057, 3169, 14053, 342, 5145, 4715, 247, 6630, 588, 472, 1162, 355, 549, 32693, 1518, 1229, 2537, 746, 271, 18389, 327, 3332, 5145, 4715, 5609, 323, 2245, 10546, 7839, 757, 2718, 22765, 18279, 2150, 3737, 277, 9169, 285, 22791, 272, 2341, 36157, 272, 11454, 6928, 323, 4715, 8062, 432, 941, 1182, 73, 543, 1162, 355, 298, 21, 12352, 43425, 50276, 8826, 3081, 5884, 5701, 50275, 783, 1273, 12494, 273, 253, 10199, 25957, 253, 6353, 1474, 4513, 432, 253, 3806, 24679, 9086, 846, 15708, 347, 767, 6353, 2550, 3007, 504, 275, 436, 4758, 253, 4477, 943, 294, 40712, 253, 6197, 342, 846, 247, 2810, 
13329, 50275, 783, 4477, 943, 1908, 16984, 17356, 26332, 253, 4373, 19484, 326, 12853, 253, 26370, 4313, 7887, 347, 247, 36384, 1180, 352, 651, 1056, 253, 1127, 30909, 50275, 12563, 253, 4067, 296, 36271, 943, 320, 2057, 247, 4067, 296, 36271, 390, 253, 6253, 296, 1648, 383, 436, 2929, 556, 4081, 247, 15246, 533, 3576, 2900, 326, 476, 9441, 10546, 7839, 757, 8062, 1014, 672, 253, 13200, 258, 3229, 10738, 10704, 26370, 253, 1895, 310, 1077, 973, 24013, 8550, 285, 253, 2929, 11424, 253, 5161, 5697, 275, 247, 1077, 10799, 1039, 2299, 891, 452, 5393, 275, 253, 2022, 2278, 690, 7794, 3340, 253, 9376, 326, 278, 310, 247, 16421, 4315, 342, 12028, 326, 403, 1899, 17777, 943, 320, 6283, 9713, 390, 5544, 281, 2572, 253, 345, 719, 1866, 405, 285, 4583, 19843, 273, 253, 2929, 50275, 5996, 30080, 22559, 2380, 50276, 74, 651, 751, 281, 5717, 253, 4477, 323, 15974, 253, 2720, 7350, 891, 452, 9300, 619, 4868, 5474, 33032, 2520, 2929, 10262, 247, 747, 1332, 26370, 13823, 11454, 2990, 256, 1136, 323, 4715, 10546, 7839, 757, 2718, 432, 941, 253, 4477, 4853, 247, 26370, 13823, 3605, 618, 74, 323, 49653, 253, 3733, 941, 715, 13827, 285, 1327, 296, 1648, 3530, 1754, 327, 253, 9162, 906, 253, 3213, 1979, 323, 253, 10704, 47037, 310, 10904, 285, 253, 1180, 273, 3530, 323, 3733, 310, 16645, 253, 12510, 273, 256, 1136, 310, 5183, 970, 29784, 10546, 7839, 757, 2718, 26332, 247, 1264, 2915, 1895, 285, 270, 3370, 472, 1566, 2266, 2792, 50276, 66, 26370, 13823, 2746, 323, 4715, 10546, 7839, 757, 2718, 310, 4460, 50276, 783, 4679, 921, 326, 256, 1136, 476, 13613, 26065, 2570, 10546, 7839, 757, 2718, 685, 8245, 3082, 288, 9866, 285, 49975, 9866, 50276, 2520, 2929, 310, 973, 15720, 50276, 20881, 2792, 50276, 9088, 403, 690, 4373, 22041, 281, 320, 3413, 13542, 50276, 783, 2429, 3082, 403, 253, 8050, 5927, 50276, 9088, 403, 2067, 7350, 670, 253, 5661, 4758, 50276, 26122, 337, 253, 4836, 273, 4715, 3520, 8062, 432, 941, 556, 644, 273, 1270, 1600, 4102, 247, 2234, 2934, 273, 24049, 26370, 715, 253, 4715, 6974, 310, 4460, 285, 12302, 253, 4081, 1332, 310, 2969, 533, 3576, 2299, 627, 403, 253, 4373, 22041, 17356, 256, 281, 320, 13542, 3413, 3340, 253, 4313, 273, 253, 13827, 5110, 17356, 812, 320, 4619, 323, 3045, 812, 253, 4477, 5513, 849, 281, 6642, 436, 4373, 19484, 323, 2710, 3520, 2718, 1690, 253, 278, 2915, 1895, 672, 278, 310, 1781, 50276, 19, 253, 4081, 1332, 966, 7790, 253, 11508, 715, 8985, 13301, 326, 310, 13827, 390, 1327, 296, 1648, 352, 1537, 320, 9371, 281, 1566, 253, 5415, 331, 338, 925, 28965, 323, 1016, 7726, 326, 310, 3037, 494, 50276, 20, 253, 4477, 5467, 253, 39690, 10546, 7839, 757, 288, 33426, 17394, 87, 82, 476, 253, 4081, 4715, 7792, 4647, 281, 253, 275, 16806, 494, 14923, 1566, 50276, 21, 253, 288, 9866, 14370, 21329, 316, 1162, 355, 6247, 1057, 417, 1908, 253, 39690, 9376, 275, 253, 4679, 858, 253, 4477, 897, 436, 9376, 275, 253, 288, 9866, 4715, 671, 253, 3280, 273, 288, 9866, 310, 253, 7898, 13335, 12401, 256, 1136, 285, 20589, 9866, 849, 858, 253, 4477, 1918, 253, 3280, 281, 288, 9866, 50276, 22, 2139, 858, 253, 4477, 897, 253, 26416, 71, 6375, 47037, 310, 2649, 253, 2969, 299, 14398, 1332, 4569, 436, 2929, 12453, 253, 4722, 1895, 285, 310, 973, 24013, 8550, 671, 253, 4081, 2746, 310, 4460, 285, 253, 5661, 1543, 403, 47860, 3738, 891, 5257, 281, 2997, 436, 2929, 891, 452, 690, 7350, 534, 891, 2508, 275, 253, 2022, 2278, 5474, 33032, 2520, 2929, 29328, 281, 3157, 253, 4715, 273, 247, 10546, 7839, 757, 985, 407, 39330, 253, 26370, 273, 253, 673, 2962, 941, 50276, 66, 26370, 13823, 3605, 310, 806, 908, 
281, 30215, 253, 673, 7726, 715, 13827, 285, 1327, 296, 1648, 840, 1309, 253, 3733, 273, 253, 10546, 7839, 757, 2990, 253, 13827, 629, 310, 8527, 970, 247, 4577, 4522, 383, 554, 285, 671, 19958, 625, 7208, 275, 253, 3733, 941, 23694, 253, 13827, 285, 1327, 296, 1648, 4243, 273, 253, 941, 672, 3733, 247, 10546, 7839, 757, 2990, 310, 247, 4460, 2934, 285, 253, 1543, 921, 2176, 11361, 273, 970, 436, 1332, 285, 4679, 403, 2908, 281, 2319, 253, 4833, 273, 501, 312, 4906, 9554, 27959, 285, 5743, 3470, 2299, 760, 767, 6667, 403, 2011, 275, 253, 4679, 352, 651, 320, 625, 21414, 604, 4679, 273, 625, 10546, 7839, 757, 2718, 476, 320, 5196, 50276, 8826, 3081, 3533, 672, 21565, 2852, 3054, 1057, 253, 1566, 897, 256, 4229, 673, 3213, 390, 1335, 970, 247, 4577, 4522, 383, 554, 323, 13827, 4243, 604, 594, 849, 310, 253, 26370, 5118, 403, 627, 1534, 11361, 273, 970, 271, 3081, 6194, 494, 1307, 268, 12780, 3185, 273, 3733, 247, 2014, 2990, 326, 6125, 268, 12780, 50276, 82, 88, 50274, 9088, 403, 2176, 9021, 273, 436, 2929, 327, 36636, 39330, 253, 26370, 273, 253, 673, 2962, 941, 534, 2722, 697, 5750, 273, 11138, 253, 3045, 273, 253, 10546, 7839, 757, 2990, 5474, 33032, 2520, 1263, 29328, 253, 13827, 13823, 3605, 618, 74, 323, 271, 9826, 8967, 5150, 285, 253, 3733, 5700, 326, 4648, 3530, 342, 1781, 618, 261, 625, 7208, 247, 11454, 2990, 556, 253, 15424, 8492, 281, 5257, 281, 3037, 247, 6032, 1159, 760, 247, 3710, 5110, 273, 941, 2797, 432, 247, 13827, 985, 15646, 247, 5233, 1818, 275, 643, 3000, 253, 3733, 941, 310, 516, 30063, 875, 247, 26830, 1818, 285, 247, 5233, 1818, 7613, 1293, 618, 74, 352, 310, 2834, 323, 247, 11454, 2990, 281, 3037, 247, 13827, 8062, 253, 7680, 273, 618, 74, 310, 5783, 970, 247, 1264, 2915, 1895, 50275, 993, 23223, 50276, 262, 310, 271, 47860, 14876, 326, 253, 26370, 310, 247, 3673, 44856, 281, 4715, 273, 247, 3520, 985, 50276, 262, 310, 10084, 326, 247, 2969, 689, 48027, 310, 2217, 50275, 8265, 3993, 50276, 4674, 608, 14371, 326, 618, 74, 310, 247, 1175, 11193, 281, 26370, 3605, 4927, 533, 436, 1537, 417, 2186, 323, 1027, 13249, 2718, 4927, 19584, 326, 253, 6510, 273, 253, 13249, 985, 310, 271, 12902, 1127, 275, 643, 3000, 253, 8492, 432, 253, 12902, 1127, 310, 2168, 42426, 432, 253, 1899, 2299, 618, 74, 1057, 417, 390, 2550, 275, 3946, 253, 5222, 273, 253, 1375, 285, 7624, 618, 74, 3469, 327, 253, 13249, 985, 352, 310, 29224, 281, 3588, 253, 31376, 273, 618, 74, 50276, 783, 10491, 5700, 310, 47641, 347, 2011, 275, 2829, 337, 253, 1543, 403, 7996, 281, 253, 4373, 19484, 25184, 247, 12925, 17285, 28055, 310, 29224, 50275, 37585, 5701, 50276, 262, 1537, 320, 271, 47860, 14876, 326, 253, 11454, 2990, 14280, 281, 3037, 247, 6032, 1159, 285, 436, 15424, 8492, 310, 247, 3673, 44856, 281, 4715, 273, 247, 13827, 985, 2299, 436, 14876, 310, 417, 17618, 407, 4679, 285, 352, 310, 12744, 1880, 253, 10419, 501, 14503, 436, 1895, 271, 3081, 3368, 390, 1783, 310, 417, 17396, 533, 778, 3157, 253, 7680, 273, 253, 1246, 1263, 50276, 13206, 337, 310, 247, 2372, 21643, 891, 4510, 326, 253, 11498, 273, 253, 24039, 403, 4229, 1561, 1016, 1650, 50275, 6438, 5955, 50276, 455, 619, 7350, 497, 9713, 407, 253, 3081, 4679, 285, 22909, 891, 5731, 253, 4868, 432, 608, 281, 721, 436, 1263, 310, 1754, 327, 271, 47860, 14876, 285, 253, 4081, 1332, 310, 2969, 533, 3576, 2299, 253, 5700, 310, 47641, 285, 253, 31376, 310, 12744, 50276, 187, 187, 4118, 18435, 27, 2520, 2929, 23970, 253, 26370, 13823, 11454, 2990, 256, 1136, 323, 11138, 10704, 7882, 275, 10546, 7839, 757, 11454, 6928, 281, 436, 990, 253, 4477, 9569, 
253, 26370, 13823, 3605, 618, 74, 281, 30215, 673, 11508, 715, 13827, 285, 1327, 296, 1648, 11821, 285, 12661, 281, 5223, 253, 9554, 6974, 15672, 50276, 783, 2929, 8523, 2959, 1264, 5075, 2997, 285, 581, 5075, 12009, 12645, 253, 2022, 7364, 8042, 562, 407, 30628, 14588, 281, 5816, 10414, 432, 253, 6239, 13260, 3212, 253, 4081, 2746, 24088, 2605, 273, 253, 2280, 4315, 39690, 10546, 7839, 757, 285, 8254, 6787, 327, 4679, 1690, 3081, 1666, 25379, 285, 4373, 19484, 7533, 50276, 783, 30080, 22559, 858, 247, 1175, 2628, 275, 22291, 30628, 7350, 391, 770, 86, 2559, 521, 13716, 281, 247, 2590, 2997, 285, 391, 2577, 7096, 2559, 521, 13716, 281, 5075, 2997, 6524, 627, 310, 247, 13969, 2190, 30628, 281, 2997, 253, 2929, 50275, 783, 913, 84, 1211, 28799, 5783, 253, 30628, 17401, 253, 1332, 310, 15246, 2568, 3576, 285, 253, 2929, 310, 973, 3542, 253, 12510, 273, 253, 4081, 2746, 310, 2011, 275, 1027, 22349, 1580, 2067, 2570, 2718, 10738, 29784, 5319, 253, 2929, 10316, 247, 14282, 7680, 281, 253, 3114 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: in this paper the authors proposed to represent the image as a graph structure and introduce a graph neural network vig architecture to extract graph level feature for visual tasks the graph neural network can be aligned with standard vision transformers with many shared micro designs images are split into patches as nodes in graphs each node is connected with its neighborhoods the vig network is in a hierarchical feature extraction style like that of swin transformer the authors conducted extensive experiments on image recognition and object detection and achieved comparable performance with other stateoftheart methods strength 1 this paper is wellorganized and can be easily understood by readers the technical details are introduced clearly 2 the authors conducted extensive experiments on multiple benchmarks to investigate the effectiveness of different modules and designs in this paper weakness 1 the idea of graph neural network for visual recognition is appealing but it seems to be great to exploit image structures to adjust deep network structures however it seems that vig just takes the simplest setting of graph neural network it looks like a complex version of vision transformer the hierarchical structures involved are in a swin transformer style since there are a lot of techniques in vit used such as multihead attention feed forward network and etc i am a little confused about whether vig is just a simplified version of swin transformer 2 the feature dimensions resolutions and other settings listed in table 2 are very similar to that of swin transformer so when there are 224 times 224 inputs vig outputs 7 times 7 feature maps but in table 5 i dont find obvious advantage of vig when compared with swin transformer 3 another big problem is that table 3 lists a lot of tricks used in vig according to the mae paper masked autoencoders are scalable vision learners cvpr 2022 a vanilla vitb model 86m parameters can get 823 top 1 imagenet accuracy with the exponential moving average trick ema so the proposed vig doesnt show any advantages compared with the vitb model in table 1 i suggest the authors check whether its possible to use image intrinsic structures proposed in graphfpn graphfpn graph feature pyramid network for object detection to guide the feature learning of vig it will be very interesting then docsepthis manuscript proposes a new kind of backbone named vig which represents the image as a graph and extracts graphlevel features for vision tasks specifically the input image is separated into patches as nodes in a graph grapher module and ffn module are used to aggregate the information among nodes and transfer feature space isotropic and pyramid architectures are proposed to build models of different sizes the vig is compared with other sota backbones on both image classification task and object detection task pros representing an image as graph is novel and interesting the proposed grapher module and ffn are sound also the visualization of the graph structure indeed shows that the proposed model has learned meaningful relationships among image patches the ablation studies and experiments on several vision tasks are good both isotropic and pyramid architectures show the effectiveness of the proposed vig model cons some parts of the manuscript are unclear for example how to initialize a graph of an image is unclear it is said the graph is built based on k nearest neighbors
but how to compute the knn is unclear is the knn constructed based on the position or the similarity of the feature does the method to compute the knn influence the performance it would be great if the authors can show more visualization cases of the constructed graph under a more complicated scenario that several objects are involved yes docseprecently convnets and transformers have achieved stateoftheart results on various visual recognition tasks this paper explores graph neural networks gnn for visual tasks by constructing a graph structure from the input image the paper discusses the differences and advantages of graph structure over grid and sequence structure gnn is applied on the graph data and ffn is introduced for improving the feature diversity the obtained vig backbone can achieve comparable and even better performance than the sota convnets and transformers strengths the paper is easy to follow and wellwritten the pioneering exploration of gnn as vision backbone is inspiring to more works this work will much appeal to the community the experiments on image classification and object detection show the effectiveness of vision gnn weaknesses in different layers will the constructed graph structure be updated how vig is used in object detection is not described in detail please include the implementation details using graph neural networks as a component although not as the backbone for image recognition has been investigated in some previous works 12 please discuss with these works 1 chen zhaomin et al multilabel image recognition with graph convolutional networks proceedings of the ieeecvf conference on computer vision and pattern recognition 2019 2 shen yantao et al person reidentification with deep similarityguided graph neural network proceedings of the european conference on computer vision eccv 2018 the limitations and potential negative societal impact were addressed in the paper docsepthis paper introduces a gnnbased backbone model for image tasks that is mainly built on the gnn layers and fc layers basically it splits an image into patches and takes each patch as a node to construct the graph structure a gnnbased architecture is utilized to the graph for visual representation learning to address the degradation of feature diversity more transformation of feature is added in the network extensive experiments of image classification and object detection show that the proposed vision gnn can outperform representative convolutional networks and transformers with similar number of parameters strengths this work is clearly motivated and well written the background of the research the motivation of graph representation for images and the related work are all clearly stated and summarized the first gnnbased backbone for visual tasks the simple yet effective vision gnn is introduced by adapting the gnn with reasonable modification by patchbased graph construction and adding more node feature transformations extensive evaluation and impressive results this paper demonstrates the power of the vision gnn model on imagenet and coco datasets outperforming representative cnn and transformer models the results are impressive and interesting weaknesses based on my experience the inference speed of gnn is not as fast as cnn its a general issue for the researchers to explore for mobile applications of vision gnn typos in line41 gcngrapher in supplemental material the output channel of selffc1 in ffn should be hiddenchannels rather than inchannels the limitations are addressed ### Summary:
this paper proposes to explore the graph structure of images by considering patches as nodes where the graph is constructed by connecting nearest neighbors extensive experiments on various visual tasks ie image recognition and object detection have demonstrated the effectiveness of the proposed vig all the reviewers agree on the inspiring and promising exploration the paper is also wellwritten and the experimental results are impressive
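As a concrete illustration of the patch-as-node construction discussed in the reviews above, and of the question whether the knn graph is built from positions or from feature similarity, the sketch below builds a k-nearest-neighbour graph over patch features using Euclidean feature distance; it is one plausible reading for illustration only, not the ViG reference implementation, and the patch count, feature dimension, and k are assumed values.

```python
import numpy as np

def knn_patch_graph(patch_features, k=9):
    # patch_features: (N, D) array, one row per image patch (one graph node).
    # Neighbours are picked by feature similarity (Euclidean distance); using the
    # patches' spatial positions instead would be the other option the reviews mention.
    dists = np.linalg.norm(patch_features[:, None, :] - patch_features[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)                  # exclude self-loops
    neighbours = np.argsort(dists, axis=1)[:, :k]    # k closest nodes per patch
    return [(i, int(j)) for i in range(len(patch_features)) for j in neighbours[i]]

# Toy usage: a 224x224 image cut into 14x14 = 196 patches with assumed 192-dim features.
edges = knn_patch_graph(np.random.randn(196, 192), k=9)
```

On top of such a graph, the reviewed paper describes a grapher module that aggregates information among neighbouring nodes and an FFN that transforms each node's features; whether the edges are recomputed from the updated features in deeper layers is exactly the point the reviewers ask the authors to clarify.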
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 249, 436, 2929, 253, 4477, 4081, 281, 1957, 253, 2460, 347, 247, 4216, 2605, 285, 9569, 247, 4216, 11454, 2990, 13271, 10336, 281, 4908, 4216, 1268, 4735, 323, 5304, 8892, 253, 4216, 11454, 2990, 476, 320, 15616, 342, 2629, 8113, 4979, 398, 342, 1142, 6096, 2494, 11809, 3888, 403, 653, 2878, 715, 20412, 347, 7632, 275, 14580, 1016, 4666, 310, 4802, 342, 697, 25237, 253, 13271, 2990, 310, 275, 247, 24498, 4735, 11998, 3740, 751, 326, 273, 1863, 249, 39707, 253, 4477, 5196, 9470, 4679, 327, 2460, 8981, 285, 1789, 5481, 285, 389, 522, 83, 494, 3045, 342, 643, 1375, 23037, 14387, 3082, 4757, 337, 436, 2929, 310, 973, 34092, 285, 476, 320, 4354, 7192, 407, 10668, 253, 7681, 4278, 403, 5611, 4518, 374, 253, 4477, 5196, 9470, 4679, 327, 2709, 49602, 281, 7409, 253, 12510, 273, 1027, 11911, 285, 11809, 275, 436, 2929, 50276, 20881, 1255, 337, 253, 2934, 273, 4216, 11454, 2990, 323, 5304, 8981, 310, 23176, 533, 352, 3133, 281, 320, 1270, 281, 866, 13013, 953, 2460, 5289, 281, 4575, 3676, 2990, 5289, 2299, 352, 3133, 326, 13271, 816, 3936, 253, 22325, 4758, 273, 4216, 11454, 2990, 352, 4453, 751, 247, 2570, 2336, 1297, 273, 8113, 39707, 253, 24498, 5289, 3206, 310, 275, 247, 1863, 249, 39707, 3740, 1580, 627, 403, 247, 2257, 273, 5609, 275, 9084, 908, 824, 347, 4471, 2522, 4116, 3997, 3579, 2990, 285, 3966, 891, 717, 247, 1652, 13477, 670, 1880, 13271, 310, 816, 247, 8077, 71, 728, 2715, 273, 1863, 249, 39707, 50276, 19, 253, 4735, 10103, 30285, 285, 643, 7533, 7117, 275, 2829, 374, 403, 1077, 2074, 281, 326, 273, 1863, 249, 39707, 50276, 601, 672, 627, 403, 22856, 2069, 22856, 14800, 13271, 18012, 818, 2069, 818, 4735, 8115, 533, 275, 2829, 608, 891, 13414, 1089, 4755, 5750, 273, 13271, 672, 2429, 342, 1863, 249, 39707, 495, 1529, 1943, 1895, 310, 326, 2829, 495, 10894, 247, 2257, 273, 24866, 908, 275, 13271, 2556, 281, 253, 278, 3348, 2929, 34741, 6753, 2083, 351, 398, 403, 44755, 8113, 40390, 30105, 1087, 1384, 1423, 247, 26724, 9084, 67, 1566, 11614, 78, 3602, 476, 755, 854, 1508, 1755, 337, 4440, 257, 292, 7200, 342, 253, 17619, 4886, 3388, 10480, 299, 785, 594, 253, 4081, 13271, 36908, 921, 667, 11361, 2429, 342, 253, 9084, 67, 1566, 275, 2829, 337, 50276, 74, 1804, 253, 4477, 2451, 1880, 697, 1896, 281, 897, 2460, 15276, 5289, 4081, 275, 4216, 71, 16077, 50276, 10580, 71, 16077, 4216, 4735, 39694, 2990, 323, 1789, 5481, 50276, 936, 7102, 253, 4735, 4715, 273, 13271, 352, 588, 320, 1077, 4722, 840, 5474, 33032, 2520, 7714, 29328, 247, 747, 2238, 273, 27882, 4907, 13271, 534, 6125, 253, 2460, 347, 247, 4216, 285, 16756, 4216, 5251, 3386, 323, 8113, 8892, 5742, 253, 3280, 2460, 310, 9070, 715, 20412, 347, 7632, 275, 247, 4216, 17309, 379, 6333, 285, 269, 4174, 6333, 403, 908, 281, 19737, 253, 1491, 2190, 7632, 285, 3700, 4735, 2317, 29436, 285, 39694, 35615, 403, 4081, 281, 1973, 3210, 273, 1027, 9552, 253, 13271, 403, 2429, 342, 643, 256, 5503, 896, 47473, 327, 1097, 2460, 9162, 4836, 285, 1789, 5481, 4836, 5847, 50275, 12554, 272, 271, 2460, 347, 4216, 310, 4460, 285, 4722, 50275, 783, 4081, 17309, 379, 6333, 285, 269, 4174, 403, 3590, 1255, 671, 253, 24426, 273, 253, 4216, 2605, 6296, 2722, 326, 253, 4081, 1566, 556, 6311, 14282, 7688, 2190, 2460, 20412, 50275, 783, 28913, 2175, 285, 4679, 327, 2067, 8113, 8892, 403, 1175, 1097, 29436, 285, 39694, 35615, 921, 253, 12510, 273, 253, 4081, 13271, 1566, 50276, 5040, 50275, 8826, 4243, 273, 253, 7714, 403, 
12744, 323, 1650, 849, 281, 26641, 247, 4216, 273, 271, 2460, 310, 12744, 352, 310, 753, 253, 4216, 310, 4270, 1754, 327, 465, 5275, 15833, 533, 849, 281, 11897, 253, 694, 79, 310, 12744, 310, 253, 694, 79, 8818, 1754, 327, 253, 1899, 390, 253, 14259, 273, 253, 4735, 50275, 18566, 253, 1332, 281, 11897, 253, 694, 79, 4833, 253, 3045, 50275, 262, 651, 320, 1270, 604, 253, 4477, 476, 921, 625, 24426, 2219, 273, 253, 8818, 4216, 762, 247, 625, 9542, 10076, 326, 2067, 5113, 403, 3206, 50276, 9820, 5474, 339, 3456, 1154, 314, 2410, 47301, 285, 4979, 398, 452, 6786, 1375, 23037, 14387, 1543, 327, 2710, 5304, 8981, 8892, 436, 2929, 33826, 4216, 11454, 6928, 305, 9866, 323, 5304, 8892, 407, 26736, 247, 4216, 2605, 432, 253, 3280, 2460, 253, 2929, 25339, 253, 3910, 285, 11361, 273, 4216, 2605, 689, 9860, 285, 3425, 2605, 305, 9866, 310, 3732, 327, 253, 4216, 941, 285, 269, 4174, 310, 5611, 323, 11138, 253, 4735, 9991, 253, 2797, 13271, 27882, 476, 5115, 10870, 285, 1014, 1805, 3045, 685, 253, 256, 5503, 2410, 47301, 285, 4979, 398, 20544, 50276, 783, 2929, 310, 3477, 281, 956, 285, 973, 15720, 253, 45200, 17947, 273, 305, 9866, 347, 8113, 27882, 310, 29853, 281, 625, 2987, 436, 789, 588, 1199, 4549, 281, 253, 3114, 253, 4679, 327, 2460, 9162, 285, 1789, 5481, 921, 253, 12510, 273, 8113, 305, 9866, 50276, 20881, 1255, 265, 50275, 249, 1027, 8090, 588, 253, 8818, 4216, 2605, 320, 9300, 50276, 5430, 13271, 310, 908, 275, 1789, 5481, 310, 417, 2529, 275, 2508, 4496, 2486, 253, 7092, 4278, 50276, 5302, 4216, 11454, 6928, 347, 247, 4445, 3738, 417, 347, 253, 27882, 323, 2460, 8981, 556, 644, 6949, 275, 690, 2045, 2987, 1249, 4496, 2319, 342, 841, 2987, 50276, 18, 260, 864, 1182, 3227, 5240, 1162, 355, 33362, 1492, 2460, 8981, 342, 4216, 27311, 267, 6928, 10061, 273, 253, 26332, 70, 886, 39985, 8059, 327, 4382, 8113, 285, 3102, 8981, 6247, 374, 703, 79, 340, 7300, 80, 1162, 355, 1436, 294, 888, 1877, 342, 3676, 14259, 26960, 4216, 11454, 2990, 10061, 273, 253, 19454, 266, 8059, 327, 4382, 8113, 23746, 87, 4765, 253, 7364, 285, 2442, 4016, 38058, 3486, 497, 9713, 275, 253, 2929, 5474, 33032, 2520, 2929, 23970, 247, 305, 9866, 3169, 27882, 1566, 323, 2460, 8892, 326, 310, 7194, 4270, 327, 253, 305, 9866, 8090, 285, 269, 68, 8090, 10323, 352, 36509, 271, 2460, 715, 20412, 285, 3936, 1016, 12097, 347, 247, 4666, 281, 3989, 253, 4216, 2605, 247, 305, 9866, 3169, 10336, 310, 12845, 281, 253, 4216, 323, 5304, 6779, 4715, 281, 2953, 253, 11961, 273, 4735, 9991, 625, 9261, 273, 4735, 310, 2879, 275, 253, 2990, 9470, 4679, 273, 2460, 9162, 285, 1789, 5481, 921, 326, 253, 4081, 8113, 305, 9866, 476, 562, 32231, 8612, 27311, 267, 6928, 285, 4979, 398, 342, 2074, 1180, 273, 3602, 20544, 50276, 2520, 789, 310, 4518, 17194, 285, 973, 3542, 253, 4114, 273, 253, 2561, 253, 16038, 273, 4216, 6779, 323, 3888, 285, 253, 2905, 789, 403, 512, 4518, 4767, 285, 17903, 50276, 783, 806, 305, 9866, 3169, 27882, 323, 5304, 8892, 253, 2969, 2568, 3576, 8113, 305, 9866, 310, 5611, 407, 42174, 253, 305, 9866, 342, 5272, 11237, 407, 12097, 3169, 4216, 5140, 285, 6240, 625, 4666, 4735, 21257, 50276, 2068, 3134, 7103, 285, 13943, 1543, 436, 2929, 14371, 253, 1612, 273, 253, 8113, 305, 9866, 1566, 327, 4440, 257, 292, 285, 9285, 80, 15302, 41731, 14692, 8612, 260, 9866, 285, 39707, 3210, 253, 1543, 403, 13943, 285, 4722, 50276, 20881, 1255, 265, 50276, 3169, 327, 619, 2793, 253, 17032, 3885, 273, 305, 9866, 310, 417, 347, 3809, 347, 260, 9866, 697, 247, 2087, 2523, 323, 253, 8607, 281, 8338, 323, 6109, 4893, 273, 8113, 305, 9866, 50276, 
555, 993, 275, 1386, 3156, 305, 68, 1251, 1761, 379, 275, 25702, 2144, 253, 3453, 5048, 273, 11329, 567, 68, 18, 275, 269, 4174, 943, 320, 8763, 32460, 2581, 685, 16416, 7059, 50276, 783, 7364, 403, 9713, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 281, 8338, 253, 4216, 2605, 273, 3888, 407, 7296, 20412, 347, 7632, 835, 253, 4216, 310, 8818, 407, 12873, 5275, 15833, 9470, 4679, 327, 2710, 5304, 8892, 26332, 2460, 8981, 285, 1789, 5481, 452, 5183, 253, 12510, 273, 253, 4081, 13271, 512, 253, 30628, 5194, 327, 253, 29853, 285, 12532, 17947, 253, 2929, 310, 671, 973, 15720, 285, 253, 5661, 1543, 403, 13943, 209 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 249, 436, 2929, 253, 4477, 4081, 281, 1957, 253, 2460, 347, 247, 4216, 2605, 285, 9569, 247, 4216, 11454, 2990, 13271, 10336, 281, 4908, 4216, 1268, 4735, 323, 5304, 8892, 253, 4216, 11454, 2990, 476, 320, 15616, 342, 2629, 8113, 4979, 398, 342, 1142, 6096, 2494, 11809, 3888, 403, 653, 2878, 715, 20412, 347, 7632, 275, 14580, 1016, 4666, 310, 4802, 342, 697, 25237, 253, 13271, 2990, 310, 275, 247, 24498, 4735, 11998, 3740, 751, 326, 273, 1863, 249, 39707, 253, 4477, 5196, 9470, 4679, 327, 2460, 8981, 285, 1789, 5481, 285, 389, 522, 83, 494, 3045, 342, 643, 1375, 23037, 14387, 3082, 4757, 337, 436, 2929, 310, 973, 34092, 285, 476, 320, 4354, 7192, 407, 10668, 253, 7681, 4278, 403, 5611, 4518, 374, 253, 4477, 5196, 9470, 4679, 327, 2709, 49602, 281, 7409, 253, 12510, 273, 1027, 11911, 285, 11809, 275, 436, 2929, 50276, 20881, 1255, 337, 253, 2934, 273, 4216, 11454, 2990, 323, 5304, 8981, 310, 23176, 533, 352, 3133, 281, 320, 1270, 281, 866, 13013, 953, 2460, 5289, 281, 4575, 3676, 2990, 5289, 2299, 352, 3133, 326, 13271, 816, 3936, 253, 22325, 4758, 273, 4216, 11454, 2990, 352, 4453, 751, 247, 2570, 2336, 1297, 273, 8113, 39707, 253, 24498, 5289, 3206, 310, 275, 247, 1863, 249, 39707, 3740, 1580, 627, 403, 247, 2257, 273, 5609, 275, 9084, 908, 824, 347, 4471, 2522, 4116, 3997, 3579, 2990, 285, 3966, 891, 717, 247, 1652, 13477, 670, 1880, 13271, 310, 816, 247, 8077, 71, 728, 2715, 273, 1863, 249, 39707, 50276, 19, 253, 4735, 10103, 30285, 285, 643, 7533, 7117, 275, 2829, 374, 403, 1077, 2074, 281, 326, 273, 1863, 249, 39707, 50276, 601, 672, 627, 403, 22856, 2069, 22856, 14800, 13271, 18012, 818, 2069, 818, 4735, 8115, 533, 275, 2829, 608, 891, 13414, 1089, 4755, 5750, 273, 13271, 672, 2429, 342, 1863, 249, 39707, 495, 1529, 1943, 1895, 310, 326, 2829, 495, 10894, 247, 2257, 273, 24866, 908, 275, 13271, 2556, 281, 253, 278, 3348, 2929, 34741, 6753, 2083, 351, 398, 403, 44755, 8113, 40390, 30105, 1087, 1384, 1423, 247, 26724, 9084, 67, 1566, 11614, 78, 3602, 476, 755, 854, 1508, 1755, 337, 4440, 257, 292, 7200, 342, 253, 17619, 4886, 3388, 10480, 299, 785, 594, 253, 4081, 13271, 36908, 921, 667, 11361, 2429, 342, 253, 9084, 67, 1566, 275, 2829, 337, 50276, 74, 1804, 253, 4477, 2451, 1880, 697, 1896, 281, 897, 2460, 15276, 5289, 4081, 275, 4216, 71, 16077, 50276, 10580, 71, 16077, 4216, 4735, 39694, 2990, 323, 1789, 5481, 50276, 936, 7102, 253, 4735, 4715, 273, 13271, 352, 588, 320, 1077, 4722, 840, 5474, 33032, 2520, 7714, 29328, 247, 747, 2238, 273, 27882, 4907, 13271, 534, 6125, 253, 2460, 347, 247, 4216, 285, 16756, 4216, 5251, 3386, 323, 8113, 8892, 5742, 253, 3280, 2460, 310, 9070, 715, 20412, 347, 7632, 275, 247, 4216, 17309, 379, 6333, 285, 269, 4174, 6333, 403, 908, 281, 19737, 253, 1491, 2190, 7632, 285, 3700, 4735, 2317, 29436, 285, 39694, 35615, 403, 4081, 281, 1973, 3210, 273, 1027, 9552, 253, 13271, 403, 2429, 342, 643, 256, 5503, 896, 47473, 327, 1097, 2460, 9162, 4836, 285, 1789, 5481, 4836, 5847, 50275, 12554, 272, 271, 2460, 347, 4216, 310, 4460, 285, 4722, 50275, 783, 4081, 17309, 379, 6333, 285, 269, 4174, 403, 3590, 1255, 671, 253, 24426, 273, 253, 4216, 2605, 6296, 2722, 326, 253, 4081, 1566, 556, 6311, 14282, 7688, 2190, 2460, 20412, 50275, 783, 28913, 2175, 285, 4679, 327, 2067, 8113, 8892, 403, 1175, 1097, 29436, 285, 39694, 35615, 921, 253, 12510, 273, 253, 4081, 13271, 1566, 50276, 5040, 50275, 8826, 4243, 273, 253, 7714, 403, 
12744, 323, 1650, 849, 281, 26641, 247, 4216, 273, 271, 2460, 310, 12744, 352, 310, 753, 253, 4216, 310, 4270, 1754, 327, 465, 5275, 15833, 533, 849, 281, 11897, 253, 694, 79, 310, 12744, 310, 253, 694, 79, 8818, 1754, 327, 253, 1899, 390, 253, 14259, 273, 253, 4735, 50275, 18566, 253, 1332, 281, 11897, 253, 694, 79, 4833, 253, 3045, 50275, 262, 651, 320, 1270, 604, 253, 4477, 476, 921, 625, 24426, 2219, 273, 253, 8818, 4216, 762, 247, 625, 9542, 10076, 326, 2067, 5113, 403, 3206, 50276, 9820, 5474, 339, 3456, 1154, 314, 2410, 47301, 285, 4979, 398, 452, 6786, 1375, 23037, 14387, 1543, 327, 2710, 5304, 8981, 8892, 436, 2929, 33826, 4216, 11454, 6928, 305, 9866, 323, 5304, 8892, 407, 26736, 247, 4216, 2605, 432, 253, 3280, 2460, 253, 2929, 25339, 253, 3910, 285, 11361, 273, 4216, 2605, 689, 9860, 285, 3425, 2605, 305, 9866, 310, 3732, 327, 253, 4216, 941, 285, 269, 4174, 310, 5611, 323, 11138, 253, 4735, 9991, 253, 2797, 13271, 27882, 476, 5115, 10870, 285, 1014, 1805, 3045, 685, 253, 256, 5503, 2410, 47301, 285, 4979, 398, 20544, 50276, 783, 2929, 310, 3477, 281, 956, 285, 973, 15720, 253, 45200, 17947, 273, 305, 9866, 347, 8113, 27882, 310, 29853, 281, 625, 2987, 436, 789, 588, 1199, 4549, 281, 253, 3114, 253, 4679, 327, 2460, 9162, 285, 1789, 5481, 921, 253, 12510, 273, 8113, 305, 9866, 50276, 20881, 1255, 265, 50275, 249, 1027, 8090, 588, 253, 8818, 4216, 2605, 320, 9300, 50276, 5430, 13271, 310, 908, 275, 1789, 5481, 310, 417, 2529, 275, 2508, 4496, 2486, 253, 7092, 4278, 50276, 5302, 4216, 11454, 6928, 347, 247, 4445, 3738, 417, 347, 253, 27882, 323, 2460, 8981, 556, 644, 6949, 275, 690, 2045, 2987, 1249, 4496, 2319, 342, 841, 2987, 50276, 18, 260, 864, 1182, 3227, 5240, 1162, 355, 33362, 1492, 2460, 8981, 342, 4216, 27311, 267, 6928, 10061, 273, 253, 26332, 70, 886, 39985, 8059, 327, 4382, 8113, 285, 3102, 8981, 6247, 374, 703, 79, 340, 7300, 80, 1162, 355, 1436, 294, 888, 1877, 342, 3676, 14259, 26960, 4216, 11454, 2990, 10061, 273, 253, 19454, 266, 8059, 327, 4382, 8113, 23746, 87, 4765, 253, 7364, 285, 2442, 4016, 38058, 3486, 497, 9713, 275, 253, 2929, 5474, 33032, 2520, 2929, 23970, 247, 305, 9866, 3169, 27882, 1566, 323, 2460, 8892, 326, 310, 7194, 4270, 327, 253, 305, 9866, 8090, 285, 269, 68, 8090, 10323, 352, 36509, 271, 2460, 715, 20412, 285, 3936, 1016, 12097, 347, 247, 4666, 281, 3989, 253, 4216, 2605, 247, 305, 9866, 3169, 10336, 310, 12845, 281, 253, 4216, 323, 5304, 6779, 4715, 281, 2953, 253, 11961, 273, 4735, 9991, 625, 9261, 273, 4735, 310, 2879, 275, 253, 2990, 9470, 4679, 273, 2460, 9162, 285, 1789, 5481, 921, 326, 253, 4081, 8113, 305, 9866, 476, 562, 32231, 8612, 27311, 267, 6928, 285, 4979, 398, 342, 2074, 1180, 273, 3602, 20544, 50276, 2520, 789, 310, 4518, 17194, 285, 973, 3542, 253, 4114, 273, 253, 2561, 253, 16038, 273, 4216, 6779, 323, 3888, 285, 253, 2905, 789, 403, 512, 4518, 4767, 285, 17903, 50276, 783, 806, 305, 9866, 3169, 27882, 323, 5304, 8892, 253, 2969, 2568, 3576, 8113, 305, 9866, 310, 5611, 407, 42174, 253, 305, 9866, 342, 5272, 11237, 407, 12097, 3169, 4216, 5140, 285, 6240, 625, 4666, 4735, 21257, 50276, 2068, 3134, 7103, 285, 13943, 1543, 436, 2929, 14371, 253, 1612, 273, 253, 8113, 305, 9866, 1566, 327, 4440, 257, 292, 285, 9285, 80, 15302, 41731, 14692, 8612, 260, 9866, 285, 39707, 3210, 253, 1543, 403, 13943, 285, 4722, 50276, 20881, 1255, 265, 50276, 3169, 327, 619, 2793, 253, 17032, 3885, 273, 305, 9866, 310, 417, 347, 3809, 347, 260, 9866, 697, 247, 2087, 2523, 323, 253, 8607, 281, 8338, 323, 6109, 4893, 273, 8113, 305, 9866, 50276, 
555, 993, 275, 1386, 3156, 305, 68, 1251, 1761, 379, 275, 25702, 2144, 253, 3453, 5048, 273, 11329, 567, 68, 18, 275, 269, 4174, 943, 320, 8763, 32460, 2581, 685, 16416, 7059, 50276, 783, 7364, 403, 9713, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 281, 8338, 253, 4216, 2605, 273, 3888, 407, 7296, 20412, 347, 7632, 835, 253, 4216, 310, 8818, 407, 12873, 5275, 15833, 9470, 4679, 327, 2710, 5304, 8892, 26332, 2460, 8981, 285, 1789, 5481, 452, 5183, 253, 12510, 273, 253, 4081, 13271, 512, 253, 30628, 5194, 327, 253, 29853, 285, 12532, 17947, 253, 2929, 310, 671, 973, 15720, 285, 253, 5661, 1543, 403, 13943, 209 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: summary the paper proposes a fast method for solving reinforcement learning problems with constraints oracle computational and memory complexity of the proposed algorithm are provided along with experiments on a gridworld navigation task to illustrate the convergence behavior of the proposed algorithm strength instead of using a penalty or a regularization scheme to impose safety risk or budget constraints the paper proposes to impose the constraints explicitly or in other words treat them as hard constraints dealing with hard constraints is an important open problem especially in nonconvex problems as in reinforcement learning the paper uses the reductions approach proposed in agarwal et al to transform the policy constrained problem to a mixed policy scheme in which finding a feasible solution is equivalent to finding a distribution over policies now with this reduction the paper leverages the fact that the feasible set is guaranteed to be a polytope given by the convex hull of all feasible policies and hence the paper solves the equivalent problem of finding a point that minimizes the distance to the convex polytope the minimum norm point algorithm or wolfes method is proposed to solve the distance minimization problem an advantage of the algorithm is that it can be used in tandem with any offtheshelf rl algorithm while guaranteeing feasibility unlike the penalty based methods in addition to the time complexity of the algorithm they also provide a memory complexity bound which in some sense is inherited from the minimum norm point algorithm the analysis in the appendix seems to use the techniques from the chakrabarty et al paper weaknesses there is no discussion or comparison of subproblem complexity with respect to existing methods i consider this paper to be a theoretical paper and it is unfortunate that there is no mention of any practical use cases i believe that this is an important aspect of a numerical algorithm that needs to be discussed in papers for example it is well known that exact projections do provide strictly feasible solutions and solve a problem that is in theory equivalent that is if the feasible set is a polytope euclidean projection and minimum norm point are both generic quadratic programming problems hence it is not clear why the minimum norm point should be preferred in reinforcement learning settings i believe that sparsity in the intermediate iterates seems like the crucial difference but the paper discusses this aspect merely in passing some specific examples of safety risk or budget constraints and describing the subproblem complexity for such constraints will make this clear im not sure if the experiments are reproducible with the information provided in the paper first the experiments seem separate from the rest of the paper and the reader has to go over other papers to get an idea of the overall setup for example it is not clear why 300 steps are sufficient or what the risky region is secondly there is no discussion of the hyperparameters used in the experiments and algorithm for example how was the value of epsilon determined or the step size or learning rate i think the paper would hugely benefit from ablation studies in more than one task preferably something that is used in practice after response thanks for the clarifications however conceptually important questions are not yet clarified for example the objective itself can be made into a pure
square function and hence strongly convex in both the classical wolfes formulation and the proposed one as the authors are pointing out the main issue is in designing a separation or projection oracle for the constraints which corresponds to the base polytope in the context of submodular optimization and was the main motivation for wolfes algorithm moreover the authors mention that the main difficulty in using projections is intractability but it is not clear why the linear optimization performed in the proposed algorithm is efficient docsep the authors propose c2rl to solve rl problems under convex constraints the authors reduce the rl problem under convex constraints to a distance minimization problem and solve the distance minimization problem with a frankwolfe type method with an rl solver as a subroutine the authors further show that the algorithm converges in terms of approximation error and validate some of their theoretical findings with simulations pros i like the fact that the authors have theoretical guarantees for their approximation error the reduction to a distance minimization problem is also clear cons i find it hard to understand section 4.3 and section 4.4 even after quite a few passes this is the main reason for my score and my confidence level for section 4.3 while i understand how algorithm 1 works i have no intuitive idea why algorithm 1 converges there is a lack of connection to the original wolfes algorithm such as what corresponds to the objective function and to the linear minimization oracle why the linear property of the rl oracle is important etc for section 4.4 the authors just pile up their results without further remark on the implications i dont understand the role of the sparse policy here does finding a sparse policy make the problem easier or harder why do we want to find a sparse policy it seems that the optimal mu is not unique if there are multiple mu such that c(mu) in omega if this is so the analysis of the frankwolfe type method could be tricky while i understand the challenges of the rl problem under convex constraints could the authors list specifically what are the applications that can be formulated into rl under cvx constraints do we have an easy projection operator for these convex constraints how to choose the policy set in a real world application minor comments above equation 7 is equivalent to minimizing the distance between the polytope and the convex set it is misleading to talk about the distance between the two sets maybe say find a point in the polytope that is closest to the convex set omega some comments on the meaning of equation 4 would be helpful for the readers to understand the main flow docsep this paper presents a reduction approach to tackle the optimization problem of constrained rl they propose a frankwolfe type algorithm for the task which avoids many shortcomings of previous methods such as the memory complexity they prove that their algorithm can find an epsilon-approximate solution with o(1/epsilon) invocations they also show the power of their algorithm with experiments in a gridworld navigation task though the tasks look relatively simple pros the application of a frankwolfe algorithm to the constrained rl problem is novel the method is basically different from that of miryoosefi et al 2019 the improvement is mainly due to the algorithm design the theoretical improvement is solid the paper tackles the memory requirement issue in the previous literature and only requires constant memory complexity further the number of rl oracle invocations is also reduced from o(1/epsilon^2) to o(1/epsilon) the paper is well written though i only sketched the proof in the appendix the algorithm and the analysis in the main part are reasonable and sound comments the algorithm requires a policy set mathcal{u} and finds a mixed policy mu in delta(mathcal{u}) to satisfy the constraints how to get a policy set with a feasible solution is mathcal{u} predefined for an mdp with s states and a actions the number of possible deterministic policies can be a^s trivially setting mathcal{u} as a set with all possible policies may lead to exponential computational and memory complexity the constrained rl problem can be formulated from the standard dual lp form of the rl problem in which the policy pi can be fully represented as the density over state-action pairs d(s,a) see eg 1 is it possible to solve the constrained rl problem under this formulation what is the advantage of using mixed policies over a fixed policy set mathcal{u} compared with this formulation typos line 3 of algorithm 2 1etat wt1 1etat wt1 1 constrained episodic reinforcement learning in concave-convex and knapsack settings docsep 1 summarize what the paper claims to do/contribute be positive and generous in this paper the authors consider a class of constrained mdp problems in the considered problem instead of getting a scalar reward in each step the mdp returns a vector reward which is termed a measurement vector then the problem requires finding a mixed policy such that the expected measurement vector belongs to a convex set this problem was first considered in miryoosefi et al 2019 and the authors of this paper propose a new algorithm and claim an improvement over the sparsity in the mixed policy the main idea of the new algorithm is to solve the problem as convex minimization of the squared distance to the target set a standard frankwolfe type algorithm is proposed to solve the problem convergence and complexity analysis are also provided 2 clearly state your decision accept or reject with one or two key reasons for this choice this paper is marginally above the acceptance threshold 3 provide supporting arguments for the reasons for the decision (i) weakness a significant flaw is the wrong claim of improvement over the previous result miryoosefi et al 2019 in this paper the authors yield dist(c(mu), omega) leq 1/t while miryoosefi et al 2019 proves dist(c(mu), omega) leq 1/sqrt(t) thus the authors claim an improved o(1/epsilon) complexity over the o(1/epsilon^2) complexity of the compared paper however this is a wrong argument if the authors pay attention to the definition in miryoosefi et al 2019 they should find that dist denotes the standard notion of euclidean distance whereas the dist function in this paper is the squared euclidean distance they are defined differently therefore if we view the two results under the same optimality measure these two complexity results are the same no improvement is made in this paper in terms of complexity (ii) strength though in (i) we find there is no improvement in terms of complexity and hence the sparsity of the mixed strategy the reason why i still think it is marginally above the threshold is that compared to the existing approach in miryoosefi et al 2019 the frankwolfe type algorithm is way more natural and robust (iii) strength due to the desirable structure of the frankwolfe method the authors are able to constantly eliminate the affinely dependent historical policies/measurement vectors thus limiting the storage memory to only m+1 where m is the dimension of the measurement vector 4 provide additional feedback with the aim to improve the paper make it clear that these points are here to help and are not necessarily part of your decision assessment (i) the authors should change their claim of improvement over miryoosefi et al 2019 such an improvement is not achieved in this paper (ii) in terms of notation the definition of the dist function should be changed traditionally dist only denotes the distance instead of the squared distance the authors should use dist^2_c(x) or simply change to some other notation otherwise it will cause confusion this notation confusion is also possibly the reason why the authors wrongly claim the improvement over miryoosefi et al 2019 (iii) it might be worth more discussion about what algorithm 1 is doing for example for line 3 briefly commenting that xomega is the gradient of the objective function will make the understanding of the algorithm significantly simpler similarly the authors can give more explanation about lines 6-13 of algorithm 1 which might cause confusion without explanation 5 ask questions you would like answered by the authors that help you clarify your understanding of the paper and provide the additional evidence you need to be confident in your assessment consider finding a policy pi whose measurement vector satisfies c(pi) in omega for simplicity suppose there are only 2 policies pi and pi' in this paper the proposed solution is to find p s.t. p c(pi) + (1-p) c(pi') in omega however this corresponds to the situation where before doing anything we first toss a coin to decide whether pi or pi' is used and then use that policy for all future plays however this situation is weird in the sense that neither pi nor pi' is feasible on its own and the variance can be very large my question is is it possible to find the convex combination s.t. c(p pi + (1-p) pi') in omega it is obvious that c(p pi + (1-p) pi') neq p c(pi) + (1-p) c(pi') i think such a policy will be more stable ### Summary:
the paper builds on the prior work by miryoosefi et al 2019 that finds a feasible mixed policy under convex constraints through distance minimization over a simplex set instead of the primal-dual approach used in miryoosefi et al 2019 this paper proposes to apply a frankwolfe type algorithm particularly the minimum norm point algorithm to promote sparsity of the mixed policy while achieving the same complexity despite the improvement on sparsity the ac and some reviewers share two main concerns 1 incremental novelty of the algorithm/theory which basically follows from existing optimization work 2 lack of theoretical and numerical justification of the significance of sparsity especially given that the main computation costs come from the projection and rl oracles unfortunately the paper lands just below the borderline and cannot be accepted this time
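As a concrete illustration of the reduction these reviews argue about (choosing mixture weights over candidate policies so that the mixed policy's expected measurement vector minimizes the squared distance to a target convex set), here is a minimal numpy sketch. It assumes a fixed finite set of candidate policies with known measurement vectors and a toy box-shaped omega with an explicit projection; every name in it is a hypothetical stand-in for illustration, not the paper's implementation.

```python
# illustrative sketch only: frank-wolfe over mixture weights for
# min_mu 0.5 * dist(C^T mu, Omega)^2, with Omega a toy box constraint
import numpy as np

def project_onto_omega(x, lo, hi):
    # toy target set: an axis-aligned box; a real safety/budget constraint
    # would need its own projection or separation oracle
    return np.clip(x, lo, hi)

def frank_wolfe_mixture(C, lo, hi, n_iters=200):
    """C: (n_policies, d) array whose rows are measurement vectors c(pi_i).
    Returns mixture weights mu over the candidate policies."""
    n_policies, _ = C.shape
    mu = np.zeros(n_policies)
    mu[0] = 1.0                                   # start from a single policy
    for t in range(1, n_iters + 1):
        x = C.T @ mu                              # measurement of the mixed policy
        grad = x - project_onto_omega(x, lo, hi)  # gradient of 0.5 * dist^2
        best = int(np.argmin(C @ grad))           # linear minimization step; in the
                                                  # reviewed setting this is the rl oracle
        eta = 2.0 / (t + 2)
        mu *= (1.0 - eta)
        mu[best] += eta
    return mu
```

With a true rl oracle the argmin over a fixed candidate set would instead be a call that returns a (near) best-response policy for the reward induced by the negative gradient, and that policy's measurement vector would be appended as a new candidate column.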
[ 684, 1646, 751, 253, 9560, 3064, 533, 253, 2929, 25339, 436, 4809, 7960, 275, 8136, 690, 2173, 6667, 273, 5252, 2495, 390, 7563, 10806, 285, 12930, 253, 749, 28872, 10454, 323, 824, 10806, 588, 1056, 436, 2590, 50274, 303, 417, 2119, 604, 253, 4679, 403, 41374, 342, 253, 1491, 2530, 275, 253, 2929, 806, 253, 4679, 1646, 4858, 432, 253, 1551, 273, 253, 2929, 285, 253, 9414, 556, 281, 564, 689, 643, 9380, 281, 755, 271, 2934, 273, 253, 4583, 9978, 323, 1650, 352, 310, 417, 2590, 2139, 7469, 5018, 403, 4209, 390, 752, 29198, 2919, 310, 1273, 314, 627, 310, 642, 5955, 273, 253, 4373, 22041, 908, 275, 253, 4679, 285, 5933, 323, 1650, 849, 369, 253, 1318, 273, 299, 4277, 3413, 253, 3213, 1979, 390, 4715, 2281, 891, 1158, 253, 2929, 651, 40704, 5649, 432, 28913, 2175, 275, 625, 685, 581, 4836, 13027, 1633, 326, 310, 908, 275, 3946, 50276, 6438, 2380, 6701, 323, 253, 8254, 6787, 2299, 4473, 1230, 1774, 3533, 403, 417, 2568, 31637, 2568, 323, 1650, 50276, 783, 8103, 3139, 476, 320, 1160, 715, 247, 6313, 6278, 1159, 285, 7613, 7052, 17133, 275, 1097, 8946, 25872, 265, 15895, 285, 253, 4081, 347, 253, 4477, 403, 13458, 562, 253, 2022, 2523, 310, 275, 20462, 9712, 390, 12378, 42295, 323, 253, 10806, 534, 10140, 281, 253, 2613, 3488, 936, 365, 275, 253, 3634, 273, 749, 2307, 792, 13757, 285, 369, 253, 2022, 16038, 323, 25872, 265, 5933, 25761, 4477, 3748, 326, 2022, 10183, 275, 970, 20553, 310, 540, 974, 1430, 533, 352, 310, 417, 2590, 2139, 253, 4872, 13757, 2684, 275, 253, 4081, 5933, 310, 5919, 7152, 339, 431, 248, 4477, 12661, 260, 19, 8435, 281, 8415, 391, 77, 3237, 762, 17133, 10806, 253, 4477, 4796, 253, 391, 77, 1895, 762, 17133, 10806, 281, 247, 4181, 41458, 1895, 285, 8415, 253, 4181, 41458, 1895, 342, 247, 21332, 88, 311, 453, 1511, 1332, 342, 271, 391, 77, 47037, 347, 247, 749, 27861, 460, 253, 4477, 2007, 921, 326, 253, 5933, 26414, 275, 2426, 273, 11193, 2228, 285, 17813, 690, 273, 616, 10527, 4342, 342, 9938, 50275, 856, 84, 891, 751, 253, 958, 326, 253, 4477, 452, 10527, 23632, 323, 616, 11193, 2228, 253, 5141, 281, 4181, 41458, 1895, 310, 671, 2590, 50275, 5040, 50276, 74, 1089, 352, 1892, 281, 2096, 2593, 7652, 285, 2593, 7127, 1014, 846, 3240, 247, 1643, 11999, 436, 310, 253, 2022, 1921, 323, 619, 4868, 285, 619, 7162, 1268, 323, 2593, 7652, 1223, 891, 2096, 849, 5933, 337, 2987, 891, 452, 642, 27350, 2934, 2139, 5933, 337, 26414, 627, 310, 247, 3480, 273, 4602, 281, 253, 3236, 25872, 265, 5933, 824, 347, 752, 10140, 281, 253, 14926, 1159, 285, 281, 253, 4872, 41458, 42295, 2139, 253, 4872, 2867, 273, 253, 391, 3833, 6929, 310, 1774, 3966, 323, 2593, 7127, 253, 4477, 816, 19176, 484, 616, 1543, 1293, 2007, 7579, 327, 253, 12739, 50275, 74, 13414, 2096, 253, 2554, 273, 253, 23507, 3646, 1060, 1057, 4560, 247, 23507, 3646, 2789, 253, 1895, 6927, 390, 12150, 2139, 513, 359, 971, 281, 1089, 247, 23507, 3646, 50275, 262, 3133, 326, 253, 8654, 12910, 310, 417, 4451, 604, 627, 403, 2709, 12910, 824, 326, 260, 1906, 275, 40639, 604, 436, 310, 594, 253, 1783, 273, 253, 21332, 88, 311, 453, 1511, 1332, 812, 320, 28190, 50275, 6050, 891, 2096, 253, 7881, 273, 391, 77, 1895, 762, 17133, 10806, 812, 253, 2488, 1618, 5742, 752, 403, 253, 4893, 326, 476, 320, 26115, 715, 391, 77, 762, 30105, 89, 10806, 513, 359, 452, 271, 3477, 12378, 5572, 323, 841, 17133, 10806, 849, 281, 5206, 253, 3646, 873, 275, 253, 1524, 1533, 2898, 50276, 37585, 5701, 50275, 25117, 5150, 818, 310, 6425, 281, 28699, 253, 4181, 875, 253, 3488, 936, 365, 285, 253, 17133, 873, 352, 310, 24363, 281, 2312, 670, 253, 4181, 875, 
253, 767, 5239, 5046, 1089, 247, 1127, 275, 253, 3488, 936, 365, 326, 310, 8642, 281, 253, 17133, 873, 40639, 50275, 8826, 5701, 327, 253, 4495, 273, 5150, 577, 943, 320, 9371, 323, 253, 10668, 281, 2096, 253, 2022, 2685, 5474, 33032, 2520, 2929, 10262, 247, 5141, 2746, 281, 18915, 253, 13757, 1895, 273, 20793, 391, 77, 597, 12661, 247, 21332, 88, 311, 453, 1511, 5933, 323, 253, 4836, 534, 32547, 1142, 35387, 273, 2045, 3082, 824, 347, 253, 3541, 10454, 597, 5276, 326, 616, 5933, 476, 1089, 271, 299, 4277, 9887, 2542, 2900, 342, 258, 18, 4259, 45781, 597, 671, 921, 253, 1612, 273, 616, 5933, 342, 4679, 275, 247, 9860, 10186, 15034, 4836, 2167, 253, 8892, 4453, 4942, 2969, 50276, 856, 84, 50276, 783, 2898, 273, 21332, 88, 311, 453, 5933, 281, 20793, 391, 77, 1895, 310, 4460, 253, 1332, 310, 10323, 1027, 432, 253, 326, 273, 6385, 11904, 583, 11125, 1162, 355, 6247, 783, 7756, 310, 7194, 1955, 281, 253, 5933, 2216, 50275, 783, 10527, 7756, 310, 4891, 253, 2929, 39223, 253, 3541, 8284, 50276, 15697, 275, 253, 2045, 6239, 285, 760, 4419, 3638, 3541, 10454, 2007, 253, 1180, 273, 391, 77, 42295, 45781, 310, 671, 3777, 432, 258, 18, 4259, 19, 281, 258, 18, 4259, 50275, 783, 2929, 310, 973, 15720, 2167, 891, 760, 30547, 2147, 253, 4737, 275, 253, 30762, 253, 5933, 285, 253, 1783, 275, 253, 2022, 629, 310, 5272, 285, 3590, 50276, 26122, 50276, 783, 5933, 4419, 247, 3646, 873, 14168, 1179, 86, 285, 9010, 247, 6804, 3646, 12910, 275, 1448, 85, 312, 506, 1179, 86, 281, 10517, 253, 10806, 849, 281, 755, 247, 3646, 873, 342, 247, 17887, 2900, 310, 14168, 1179, 86, 41364, 323, 271, 278, 12132, 342, 256, 3054, 285, 247, 5231, 253, 1896, 30027, 3646, 476, 320, 347, 35820, 1365, 4758, 14168, 1179, 86, 347, 247, 873, 342, 512, 1896, 7823, 778, 1421, 281, 17619, 15180, 285, 3541, 10454, 50275, 48454, 391, 77, 1895, 476, 320, 26115, 432, 253, 2629, 8746, 39322, 830, 273, 391, 77, 1895, 275, 534, 253, 3646, 12580, 476, 320, 4751, 6607, 347, 253, 4038, 689, 1375, 1913, 277, 6678, 923, 24088, 337, 310, 352, 1896, 281, 8415, 20793, 391, 77, 1895, 762, 436, 15895, 752, 310, 253, 5750, 273, 970, 6804, 7823, 689, 4229, 3646, 873, 14168, 1179, 86, 2429, 342, 436, 15895, 50276, 555, 993, 50276, 1282, 495, 273, 5933, 374, 337, 292, 255, 22923, 18, 50276, 18, 292, 255, 22923, 18, 50276, 18, 20793, 6314, 23329, 35221, 4715, 275, 40886, 44181, 285, 694, 1825, 471, 7533, 7152, 33032, 18, 26799, 752, 253, 2929, 3916, 281, 5474, 834, 3337, 320, 2762, 285, 19649, 50276, 249, 436, 2929, 253, 4477, 1908, 247, 966, 273, 20793, 278, 12132, 1895, 275, 253, 2783, 1895, 3185, 273, 2970, 247, 13434, 10921, 275, 1016, 3213, 253, 278, 12132, 6548, 247, 4972, 10921, 534, 310, 23776, 347, 247, 6814, 4972, 840, 253, 1895, 4419, 4560, 247, 6804, 3646, 824, 326, 253, 3264, 6814, 4972, 14125, 281, 247, 17133, 873, 436, 1895, 369, 806, 2783, 275, 6385, 11904, 583, 11125, 1162, 355, 50276, 9638, 285, 253, 4477, 273, 436, 2929, 12661, 247, 747, 5933, 285, 1750, 271, 7756, 689, 253, 37139, 414, 275, 253, 6804, 3646, 253, 2022, 2934, 273, 253, 747, 5933, 310, 281, 8415, 253, 1895, 347, 17133, 41458, 273, 253, 30044, 4181, 281, 253, 2303, 873, 247, 2629, 21332, 40995, 40427, 5933, 310, 4081, 281, 8415, 253, 1895, 14940, 285, 10454, 1783, 403, 671, 2530, 50275, 19, 4518, 1375, 634, 3061, 2997, 390, 12009, 342, 581, 390, 767, 2234, 4606, 323, 436, 4327, 436, 2929, 310, 42876, 1840, 253, 14924, 7887, 50275, 20, 2085, 8109, 7125, 323, 253, 4606, 323, 253, 3061, 50275, 74, 14855, 247, 1534, 19652, 310, 253, 3430, 1750, 273, 7756, 689, 253, 2045, 906, 
6385, 11904, 583, 11125, 1162, 355, 50276, 9638, 275, 436, 2929, 253, 4477, 4917, 940, 68, 1906, 3151, 458, 82, 337, 85, 1223, 6385, 11904, 583, 11125, 1162, 355, 50276, 9638, 19539, 940, 68, 1906, 3151, 458, 82, 337, 2609, 85, 3021, 253, 4477, 1750, 271, 5520, 258, 18, 4259, 10454, 689, 253, 258, 18, 4259, 19, 10454, 273, 253, 2429, 2929, 2299, 436, 310, 247, 3430, 4154, 604, 253, 4477, 2075, 4116, 281, 253, 5426, 273, 6385, 11904, 583, 11125, 1162, 355, 50276, 9638, 597, 943, 1089, 326, 253, 940, 12853, 253, 2629, 14951, 273, 299, 26365, 4181, 2299, 253, 940, 1159, 275, 436, 2929, 310, 253, 30044, 299, 26365, 4181, 597, 403, 2931, 13359, 3103, 604, 359, 1859, 253, 767, 1543, 762, 253, 1072, 5556, 1319, 2557, 841, 767, 10454, 1543, 403, 253, 1072, 642, 7756, 310, 1160, 275, 436, 2929, 275, 2426, 273, 10454, 50275, 2886, 4757, 2167, 275, 891, 359, 1089, 627, 310, 642, 7756, 275, 2426, 273, 10454, 285, 7613, 253, 37139, 414, 273, 253, 6804, 5700, 253, 1921, 2139, 891, 1335, 1158, 352, 310, 42876, 1840, 253, 7887, 310, 326, 2429, 281, 253, 5368, 2746, 275, 6385, 11904, 583, 11125, 1162, 355, 50276, 9638, 253, 21332, 88, 311, 453, 1511, 5933, 310, 1039, 625, 3626, 285, 10237, 50275, 12211, 4757, 1955, 281, 253, 11408, 2605, 273, 253, 21332, 88, 311, 453, 1332, 253, 4477, 403, 2104, 281, 11485, 13469, 253, 2438, 14258, 7976, 9493, 7823, 30238, 420, 11390, 3021, 14155, 253, 5718, 3541, 281, 760, 278, 18, 835, 278, 310, 253, 7877, 273, 253, 6814, 4972, 50276, 21, 2085, 3081, 8680, 342, 253, 4388, 281, 3157, 253, 2929, 1056, 352, 2590, 326, 841, 2792, 403, 1060, 281, 1361, 285, 417, 7933, 629, 273, 634, 3061, 6803, 50276, 74, 253, 2488, 943, 1818, 616, 1750, 273, 253, 7756, 689, 6385, 11904, 583, 11125, 1162, 355, 50276, 9638, 824, 7756, 310, 417, 6786, 275, 436, 2929, 50275, 2886, 275, 2426, 273, 14951, 253, 5426, 273, 253, 940, 1159, 943, 320, 4391, 21533, 940, 760, 12853, 253, 4181, 3185, 273, 253, 30044, 4181, 253, 4477, 943, 897, 940, 19, 33060, 390, 3365, 1818, 281, 690, 643, 14951, 5010, 352, 588, 2847, 13775, 436, 14951, 13775, 310, 671, 6830, 253, 1921, 2139, 253, 4477, 47723, 1750, 253, 7756, 689, 6385, 11904, 583, 11125, 1162, 355, 50276, 9638, 50276, 12211, 352, 1537, 4409, 625, 5955, 670, 752, 5933, 337, 310, 2509, 323, 1650, 323, 1386, 495, 13366, 4385, 326, 1269, 3151, 310, 253, 11786, 273, 253, 8103, 1159, 588, 1056, 253, 4685, 273, 253, 5933, 3012, 19554, 12014, 253, 4477, 476, 1056, 625, 8813, 670, 253, 1386, 48726, 273, 5933, 337, 534, 1537, 2847, 13775, 1293, 8813, 50276, 22, 1642, 3533, 368, 651, 751, 9577, 407, 253, 4477, 326, 1361, 368, 19148, 634, 4685, 273, 253, 2929, 285, 3400, 253, 3081, 1941, 368, 878, 281, 1056, 320, 13224, 275, 634, 6803, 50275, 15603, 4560, 247, 3646, 12580, 3692, 6814, 4972, 260, 2059, 249, 3151, 323, 17647, 9428, 627, 403, 760, 374, 7823, 275, 436, 2929, 253, 4081, 2900, 310, 281, 1089, 268, 331, 50276, 5902, 2059, 50276, 18, 5902, 2059, 249, 3151, 2299, 436, 10140, 281, 253, 4112, 835, 1078, 2509, 2712, 806, 15331, 247, 18011, 281, 7617, 1880, 12580, 390, 12580, 310, 908, 840, 897, 326, 3646, 323, 512, 2852, 7120, 2299, 436, 4112, 310, 12504, 275, 253, 3282, 326, 5293, 273, 253, 12580, 390, 12580, 310, 17887, 285, 253, 11041, 476, 320, 1077, 1781, 619, 1953, 310, 310, 352, 1896, 281, 1089, 253, 17133, 5019, 331, 260, 377, 74, 50276, 18, 377, 74, 249, 3151, 352, 310, 4755, 326, 260, 377, 74, 50276, 18, 377, 74, 425, 82, 21136, 2059, 50276, 18, 5902, 2059, 891, 1158, 824, 247, 3646, 588, 320, 331, 1752, 254, 50274, 187, 187, 4118, 18435, 27, 783, 
2929, 21168, 327, 253, 2720, 789, 407, 6385, 11904, 583, 11125, 1162, 355, 6247, 326, 9010, 247, 17887, 6804, 3646, 762, 17133, 10806, 949, 4181, 41458, 689, 247, 44053, 873, 3185, 273, 253, 819, 1983, 34716, 2746, 908, 275, 6385, 11904, 583, 11125, 1162, 355, 6247, 436, 2929, 29328, 281, 4647, 21332, 88, 311, 453, 1511, 5933, 3782, 253, 5927, 5222, 1127, 5933, 281, 8591, 37139, 414, 273, 253, 6804, 3646, 1223, 17170, 253, 1072, 10454, 50275, 3229, 3784, 253, 7756, 327, 37139, 414, 253, 913, 285, 690, 30628, 3894, 767, 2022, 7350, 337, 32809, 38135, 273, 253, 5933, 32525, 534, 10323, 3637, 432, 5368, 13757, 789, 374, 3480, 273, 10527, 285, 10704, 22861, 273, 253, 8453, 273, 37139, 414, 3340, 1677, 326, 253, 2022, 13782, 4815, 1705, 432, 12378, 285, 391, 77, 42295, 50275, 328, 9520, 253, 2929, 13446, 816, 2708, 45210, 285, 2550, 320, 7607, 436, 673, 50276 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 684, 1646, 751, 253, 9560, 3064, 533, 253, 2929, 25339, 436, 4809, 7960, 275, 8136, 690, 2173, 6667, 273, 5252, 2495, 390, 7563, 10806, 285, 12930, 253, 749, 28872, 10454, 323, 824, 10806, 588, 1056, 436, 2590, 50274, 303, 417, 2119, 604, 253, 4679, 403, 41374, 342, 253, 1491, 2530, 275, 253, 2929, 806, 253, 4679, 1646, 4858, 432, 253, 1551, 273, 253, 2929, 285, 253, 9414, 556, 281, 564, 689, 643, 9380, 281, 755, 271, 2934, 273, 253, 4583, 9978, 323, 1650, 352, 310, 417, 2590, 2139, 7469, 5018, 403, 4209, 390, 752, 29198, 2919, 310, 1273, 314, 627, 310, 642, 5955, 273, 253, 4373, 22041, 908, 275, 253, 4679, 285, 5933, 323, 1650, 849, 369, 253, 1318, 273, 299, 4277, 3413, 253, 3213, 1979, 390, 4715, 2281, 891, 1158, 253, 2929, 651, 40704, 5649, 432, 28913, 2175, 275, 625, 685, 581, 4836, 13027, 1633, 326, 310, 908, 275, 3946, 50276, 6438, 2380, 6701, 323, 253, 8254, 6787, 2299, 4473, 1230, 1774, 3533, 403, 417, 2568, 31637, 2568, 323, 1650, 50276, 783, 8103, 3139, 476, 320, 1160, 715, 247, 6313, 6278, 1159, 285, 7613, 7052, 17133, 275, 1097, 8946, 25872, 265, 15895, 285, 253, 4081, 347, 253, 4477, 403, 13458, 562, 253, 2022, 2523, 310, 275, 20462, 9712, 390, 12378, 42295, 323, 253, 10806, 534, 10140, 281, 253, 2613, 3488, 936, 365, 275, 253, 3634, 273, 749, 2307, 792, 13757, 285, 369, 253, 2022, 16038, 323, 25872, 265, 5933, 25761, 4477, 3748, 326, 2022, 10183, 275, 970, 20553, 310, 540, 974, 1430, 533, 352, 310, 417, 2590, 2139, 253, 4872, 13757, 2684, 275, 253, 4081, 5933, 310, 5919, 7152, 339, 431, 248, 4477, 12661, 260, 19, 8435, 281, 8415, 391, 77, 3237, 762, 17133, 10806, 253, 4477, 4796, 253, 391, 77, 1895, 762, 17133, 10806, 281, 247, 4181, 41458, 1895, 285, 8415, 253, 4181, 41458, 1895, 342, 247, 21332, 88, 311, 453, 1511, 1332, 342, 271, 391, 77, 47037, 347, 247, 749, 27861, 460, 253, 4477, 2007, 921, 326, 253, 5933, 26414, 275, 2426, 273, 11193, 2228, 285, 17813, 690, 273, 616, 10527, 4342, 342, 9938, 50275, 856, 84, 891, 751, 253, 958, 326, 253, 4477, 452, 10527, 23632, 323, 616, 11193, 2228, 253, 5141, 281, 4181, 41458, 1895, 310, 671, 2590, 50275, 5040, 50276, 74, 1089, 352, 1892, 281, 2096, 2593, 7652, 285, 2593, 7127, 1014, 846, 3240, 247, 1643, 11999, 436, 310, 253, 2022, 1921, 323, 619, 4868, 285, 619, 7162, 1268, 323, 2593, 7652, 1223, 891, 2096, 849, 5933, 337, 2987, 891, 452, 642, 27350, 2934, 2139, 5933, 337, 26414, 627, 310, 247, 3480, 273, 4602, 281, 253, 3236, 25872, 265, 5933, 824, 347, 752, 10140, 281, 253, 14926, 1159, 285, 281, 253, 4872, 41458, 42295, 2139, 253, 4872, 2867, 273, 253, 391, 3833, 6929, 310, 1774, 3966, 323, 2593, 7127, 253, 4477, 816, 19176, 484, 616, 1543, 1293, 2007, 7579, 327, 253, 12739, 50275, 74, 13414, 2096, 253, 2554, 273, 253, 23507, 3646, 1060, 1057, 4560, 247, 23507, 3646, 2789, 253, 1895, 6927, 390, 12150, 2139, 513, 359, 971, 281, 1089, 247, 23507, 3646, 50275, 262, 3133, 326, 253, 8654, 12910, 310, 417, 4451, 604, 627, 403, 2709, 12910, 824, 326, 260, 1906, 275, 40639, 604, 436, 310, 594, 253, 1783, 273, 253, 21332, 88, 311, 453, 1511, 1332, 812, 320, 28190, 50275, 6050, 891, 2096, 253, 7881, 273, 391, 77, 1895, 762, 17133, 10806, 812, 253, 2488, 1618, 5742, 752, 403, 253, 4893, 326, 476, 320, 26115, 715, 391, 77, 762, 30105, 89, 10806, 513, 359, 452, 271, 3477, 12378, 5572, 323, 841, 17133, 10806, 849, 281, 5206, 253, 3646, 873, 275, 253, 1524, 1533, 2898, 50276, 37585, 5701, 50275, 25117, 5150, 818, 310, 6425, 281, 28699, 253, 4181, 875, 253, 3488, 936, 365, 285, 253, 17133, 873, 352, 310, 24363, 281, 2312, 670, 253, 4181, 875, 
253, 767, 5239, 5046, 1089, 247, 1127, 275, 253, 3488, 936, 365, 326, 310, 8642, 281, 253, 17133, 873, 40639, 50275, 8826, 5701, 327, 253, 4495, 273, 5150, 577, 943, 320, 9371, 323, 253, 10668, 281, 2096, 253, 2022, 2685, 5474, 33032, 2520, 2929, 10262, 247, 5141, 2746, 281, 18915, 253, 13757, 1895, 273, 20793, 391, 77, 597, 12661, 247, 21332, 88, 311, 453, 1511, 5933, 323, 253, 4836, 534, 32547, 1142, 35387, 273, 2045, 3082, 824, 347, 253, 3541, 10454, 597, 5276, 326, 616, 5933, 476, 1089, 271, 299, 4277, 9887, 2542, 2900, 342, 258, 18, 4259, 45781, 597, 671, 921, 253, 1612, 273, 616, 5933, 342, 4679, 275, 247, 9860, 10186, 15034, 4836, 2167, 253, 8892, 4453, 4942, 2969, 50276, 856, 84, 50276, 783, 2898, 273, 21332, 88, 311, 453, 5933, 281, 20793, 391, 77, 1895, 310, 4460, 253, 1332, 310, 10323, 1027, 432, 253, 326, 273, 6385, 11904, 583, 11125, 1162, 355, 6247, 783, 7756, 310, 7194, 1955, 281, 253, 5933, 2216, 50275, 783, 10527, 7756, 310, 4891, 253, 2929, 39223, 253, 3541, 8284, 50276, 15697, 275, 253, 2045, 6239, 285, 760, 4419, 3638, 3541, 10454, 2007, 253, 1180, 273, 391, 77, 42295, 45781, 310, 671, 3777, 432, 258, 18, 4259, 19, 281, 258, 18, 4259, 50275, 783, 2929, 310, 973, 15720, 2167, 891, 760, 30547, 2147, 253, 4737, 275, 253, 30762, 253, 5933, 285, 253, 1783, 275, 253, 2022, 629, 310, 5272, 285, 3590, 50276, 26122, 50276, 783, 5933, 4419, 247, 3646, 873, 14168, 1179, 86, 285, 9010, 247, 6804, 3646, 12910, 275, 1448, 85, 312, 506, 1179, 86, 281, 10517, 253, 10806, 849, 281, 755, 247, 3646, 873, 342, 247, 17887, 2900, 310, 14168, 1179, 86, 41364, 323, 271, 278, 12132, 342, 256, 3054, 285, 247, 5231, 253, 1896, 30027, 3646, 476, 320, 347, 35820, 1365, 4758, 14168, 1179, 86, 347, 247, 873, 342, 512, 1896, 7823, 778, 1421, 281, 17619, 15180, 285, 3541, 10454, 50275, 48454, 391, 77, 1895, 476, 320, 26115, 432, 253, 2629, 8746, 39322, 830, 273, 391, 77, 1895, 275, 534, 253, 3646, 12580, 476, 320, 4751, 6607, 347, 253, 4038, 689, 1375, 1913, 277, 6678, 923, 24088, 337, 310, 352, 1896, 281, 8415, 20793, 391, 77, 1895, 762, 436, 15895, 752, 310, 253, 5750, 273, 970, 6804, 7823, 689, 4229, 3646, 873, 14168, 1179, 86, 2429, 342, 436, 15895, 50276, 555, 993, 50276, 1282, 495, 273, 5933, 374, 337, 292, 255, 22923, 18, 50276, 18, 292, 255, 22923, 18, 50276, 18, 20793, 6314, 23329, 35221, 4715, 275, 40886, 44181, 285, 694, 1825, 471, 7533, 7152, 33032, 18, 26799, 752, 253, 2929, 3916, 281, 5474, 834, 3337, 320, 2762, 285, 19649, 50276, 249, 436, 2929, 253, 4477, 1908, 247, 966, 273, 20793, 278, 12132, 1895, 275, 253, 2783, 1895, 3185, 273, 2970, 247, 13434, 10921, 275, 1016, 3213, 253, 278, 12132, 6548, 247, 4972, 10921, 534, 310, 23776, 347, 247, 6814, 4972, 840, 253, 1895, 4419, 4560, 247, 6804, 3646, 824, 326, 253, 3264, 6814, 4972, 14125, 281, 247, 17133, 873, 436, 1895, 369, 806, 2783, 275, 6385, 11904, 583, 11125, 1162, 355, 50276, 9638, 285, 253, 4477, 273, 436, 2929, 12661, 247, 747, 5933, 285, 1750, 271, 7756, 689, 253, 37139, 414, 275, 253, 6804, 3646, 253, 2022, 2934, 273, 253, 747, 5933, 310, 281, 8415, 253, 1895, 347, 17133, 41458, 273, 253, 30044, 4181, 281, 253, 2303, 873, 247, 2629, 21332, 40995, 40427, 5933, 310, 4081, 281, 8415, 253, 1895, 14940, 285, 10454, 1783, 403, 671, 2530, 50275, 19, 4518, 1375, 634, 3061, 2997, 390, 12009, 342, 581, 390, 767, 2234, 4606, 323, 436, 4327, 436, 2929, 310, 42876, 1840, 253, 14924, 7887, 50275, 20, 2085, 8109, 7125, 323, 253, 4606, 323, 253, 3061, 50275, 74, 14855, 247, 1534, 19652, 310, 253, 3430, 1750, 273, 7756, 689, 253, 2045, 906, 
6385, 11904, 583, 11125, 1162, 355, 50276, 9638, 275, 436, 2929, 253, 4477, 4917, 940, 68, 1906, 3151, 458, 82, 337, 85, 1223, 6385, 11904, 583, 11125, 1162, 355, 50276, 9638, 19539, 940, 68, 1906, 3151, 458, 82, 337, 2609, 85, 3021, 253, 4477, 1750, 271, 5520, 258, 18, 4259, 10454, 689, 253, 258, 18, 4259, 19, 10454, 273, 253, 2429, 2929, 2299, 436, 310, 247, 3430, 4154, 604, 253, 4477, 2075, 4116, 281, 253, 5426, 273, 6385, 11904, 583, 11125, 1162, 355, 50276, 9638, 597, 943, 1089, 326, 253, 940, 12853, 253, 2629, 14951, 273, 299, 26365, 4181, 2299, 253, 940, 1159, 275, 436, 2929, 310, 253, 30044, 299, 26365, 4181, 597, 403, 2931, 13359, 3103, 604, 359, 1859, 253, 767, 1543, 762, 253, 1072, 5556, 1319, 2557, 841, 767, 10454, 1543, 403, 253, 1072, 642, 7756, 310, 1160, 275, 436, 2929, 275, 2426, 273, 10454, 50275, 2886, 4757, 2167, 275, 891, 359, 1089, 627, 310, 642, 7756, 275, 2426, 273, 10454, 285, 7613, 253, 37139, 414, 273, 253, 6804, 5700, 253, 1921, 2139, 891, 1335, 1158, 352, 310, 42876, 1840, 253, 7887, 310, 326, 2429, 281, 253, 5368, 2746, 275, 6385, 11904, 583, 11125, 1162, 355, 50276, 9638, 253, 21332, 88, 311, 453, 1511, 5933, 310, 1039, 625, 3626, 285, 10237, 50275, 12211, 4757, 1955, 281, 253, 11408, 2605, 273, 253, 21332, 88, 311, 453, 1332, 253, 4477, 403, 2104, 281, 11485, 13469, 253, 2438, 14258, 7976, 9493, 7823, 30238, 420, 11390, 3021, 14155, 253, 5718, 3541, 281, 760, 278, 18, 835, 278, 310, 253, 7877, 273, 253, 6814, 4972, 50276, 21, 2085, 3081, 8680, 342, 253, 4388, 281, 3157, 253, 2929, 1056, 352, 2590, 326, 841, 2792, 403, 1060, 281, 1361, 285, 417, 7933, 629, 273, 634, 3061, 6803, 50276, 74, 253, 2488, 943, 1818, 616, 1750, 273, 253, 7756, 689, 6385, 11904, 583, 11125, 1162, 355, 50276, 9638, 824, 7756, 310, 417, 6786, 275, 436, 2929, 50275, 2886, 275, 2426, 273, 14951, 253, 5426, 273, 253, 940, 1159, 943, 320, 4391, 21533, 940, 760, 12853, 253, 4181, 3185, 273, 253, 30044, 4181, 253, 4477, 943, 897, 940, 19, 33060, 390, 3365, 1818, 281, 690, 643, 14951, 5010, 352, 588, 2847, 13775, 436, 14951, 13775, 310, 671, 6830, 253, 1921, 2139, 253, 4477, 47723, 1750, 253, 7756, 689, 6385, 11904, 583, 11125, 1162, 355, 50276, 9638, 50276, 12211, 352, 1537, 4409, 625, 5955, 670, 752, 5933, 337, 310, 2509, 323, 1650, 323, 1386, 495, 13366, 4385, 326, 1269, 3151, 310, 253, 11786, 273, 253, 8103, 1159, 588, 1056, 253, 4685, 273, 253, 5933, 3012, 19554, 12014, 253, 4477, 476, 1056, 625, 8813, 670, 253, 1386, 48726, 273, 5933, 337, 534, 1537, 2847, 13775, 1293, 8813, 50276, 22, 1642, 3533, 368, 651, 751, 9577, 407, 253, 4477, 326, 1361, 368, 19148, 634, 4685, 273, 253, 2929, 285, 3400, 253, 3081, 1941, 368, 878, 281, 1056, 320, 13224, 275, 634, 6803, 50275, 15603, 4560, 247, 3646, 12580, 3692, 6814, 4972, 260, 2059, 249, 3151, 323, 17647, 9428, 627, 403, 760, 374, 7823, 275, 436, 2929, 253, 4081, 2900, 310, 281, 1089, 268, 331, 50276, 5902, 2059, 50276, 18, 5902, 2059, 249, 3151, 2299, 436, 10140, 281, 253, 4112, 835, 1078, 2509, 2712, 806, 15331, 247, 18011, 281, 7617, 1880, 12580, 390, 12580, 310, 908, 840, 897, 326, 3646, 323, 512, 2852, 7120, 2299, 436, 4112, 310, 12504, 275, 253, 3282, 326, 5293, 273, 253, 12580, 390, 12580, 310, 17887, 285, 253, 11041, 476, 320, 1077, 1781, 619, 1953, 310, 310, 352, 1896, 281, 1089, 253, 17133, 5019, 331, 260, 377, 74, 50276, 18, 377, 74, 249, 3151, 352, 310, 4755, 326, 260, 377, 74, 50276, 18, 377, 74, 425, 82, 21136, 2059, 50276, 18, 5902, 2059, 891, 1158, 824, 247, 3646, 588, 320, 331, 1752, 254, 50274, 187, 187, 4118, 18435, 27, 783, 
2929, 21168, 327, 253, 2720, 789, 407, 6385, 11904, 583, 11125, 1162, 355, 6247, 326, 9010, 247, 17887, 6804, 3646, 762, 17133, 10806, 949, 4181, 41458, 689, 247, 44053, 873, 3185, 273, 253, 819, 1983, 34716, 2746, 908, 275, 6385, 11904, 583, 11125, 1162, 355, 6247, 436, 2929, 29328, 281, 4647, 21332, 88, 311, 453, 1511, 5933, 3782, 253, 5927, 5222, 1127, 5933, 281, 8591, 37139, 414, 273, 253, 6804, 3646, 1223, 17170, 253, 1072, 10454, 50275, 3229, 3784, 253, 7756, 327, 37139, 414, 253, 913, 285, 690, 30628, 3894, 767, 2022, 7350, 337, 32809, 38135, 273, 253, 5933, 32525, 534, 10323, 3637, 432, 5368, 13757, 789, 374, 3480, 273, 10527, 285, 10704, 22861, 273, 253, 8453, 273, 37139, 414, 3340, 1677, 326, 253, 2022, 13782, 4815, 1705, 432, 12378, 285, 391, 77, 42295, 50275, 328, 9520, 253, 2929, 13446, 816, 2708, 45210, 285, 2550, 320, 7607, 436, 673, 50276 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this manuscript proposes to reduce the intensive computation and memory requirements in reinforcement learning training by freezing the parameters of lower layers early besides the authors also propose to store the low-dimensional latent vectors rather than the high-dimensional images in the replay buffer for experience replay the effectiveness of the proposed techniques is evaluated on deepmind control environments and atari the motivation for this work is strong and the results are impressive however the proposed technique is described in a very general way without clearly defined applicable conditions and specific design methods below are detailed comments and questions 1 the main idea is to freeze lower layers of cnn encoders however for a certain network with various structures are there any applicable conditions or limitations how to choose the number of layers to freeze the proposed technique needs a rigorous definition and explanation 2 it seems the reduction comes from the freezing of lower layers i am wondering what is the computation/memory requirement breakdown in terms of layers is it always the case that lower layers consume a significant amount of computation and memory if it is not the case can lever still be effective 3 this paper also proposes to store latent vectors instead of high-dimensional images there again the paper lacks a detailed description and explanation of the applicable conditions and to what extent we can reduce the dimension of the latent vectors docsep summary this paper presents a method for compute and memory-efficient reinforcement learning where the visual encoder is frozen partway into training after freezing latent vectors are stored in the replay buffer instead of images and any existing images are replaced by them this leads to both better compute and memory utilization the authors demonstrate their method by comparing to rainbow on atari and curl on dm control on dm control their method reduces the compute by a considerable margin on atari the results are less clear cut but the compute cost is reduced when they also impose a memory constraint the effectiveness of their method is further increased strengths elegant and obvious in hindsight (a good thing) idea meaning it will likely have broad applicability while the authors only tested it on offpolicy methods it is clearly also applicable to onpolicy methods that use a rollout storage ppo ppg impala vmpo a2c etc good flops vs reward results on dm control weaknesses the memory constrained results seem very contrived 60 mb is a tiny amount of memory and even 90 gb of presumably cpu memory isnt that prohibitive perhaps if wallclock time was plotted in addition to samples the smaller memory footprint of lever would mean the replay buffer can be stored on the gpu and training would be much faster since many expensive cpu-gpu transfers would be eliminated the cnn is frozen all at once instead of frozen iteratively raghu 2017 and figure 6c suggest that the early layers could be frozen much earlier although this may increase the memory usage initially since cnns typically increase the memory size of the feature map in lower layers tf seems like yet another hyperparameter to tune in theory svcca or pwcca from morcos 2018 could be used to choose when to freeze if the representation of the shallowest unfrozen layer didnt change in the last k steps freeze it there is a nontrivial cost to computing either of those
so it would not come for free suggestions for improvement i very much like the idea of this paper but i think the chosen application is making the idea look less convincing ie freezing the cnn isnt really that impactful when the cnn and observations are tiny i urge the authors to try this for visual navigation ie pointgoal navigation in habitat ai2thor etc where deeper cnns eg resnet18 and higher resolution images eg 256x256 are used one other potential benefit of lever is the ability to increase the batch size during training as in smith 2017 this could perhaps increase its effectiveness further in figure 6a there should also be a curl lever from scratch line currently two variables are changing one paper that should be cited is fang 2019 they do a very similar thing to lever out of necessity overall while i like this paper and think the idea has a lot of potential i dont think it is quite ready for publication yet i urge the authors to try their idea in a setting with a larger cnn and higher resolutions and to see if there is a way to find tf without it being yet another hyperparameter references smith 2017 httpsarxivorgabs171100489 morcos 2018 httpsarxivorgabs180605759 fang 2019 httpsarxivorgabs190303878 post rebuttal i thank the authors for their responses the results with a larger cnn and increased batch size help show the benefit of the method further i still believe the presentation of the method would be considerably stronger if results were presented in a setting with larger cnns and higher resolution docsep significance the paper proposes to reduce the memory and computation demands of image based rl by exploiting early convergence of the convolutional encoder while the approach is quite intriguing i find it hard to see the approach being general and thus having a significant effect on the rl community pros the paper provides an interesting approach in order to save memory and computational footprint in imagebased deep rl the method is based on an observation that the convolutional encoder converges faster than the actor and critic networks the authors provide an extensive comparison and ablation study that covers different domains dmcontrol and atari the ablation study sheds more light on some of the properties of the training dynamics of an image based model free rl algorithm attention map layer convergence cons im a bit skeptical about the generality of this approach freezing the encoders weights prematurely prevents the encoder from adequately encoding previously unseen out of distribution images this in turn will hurt the agents performance down the road note that in dmcontrol the method is only showcased on the simplest tasks at least cheetah run should be included where learning a good policy only takes about 100k env steps enough to collect sufficient data support to learn an almost optimal policy figure 6c adheres to my point here as conv1 weights pretty much dont change suggesting convergence the task transfer walker stand to walker walk experiment is exactly the same as the one demonstrated in sacae httpsarxivorgpdf191001741pdf section 5.3 im not sure what is the difference here besides using curl instead of sacae could the authors elaborate also the domain transfer experiment app g shows that the approach doesnt really buy anything it seems that the approach requires storing 4 data augmented observations per env observation and the replay buffer size is equal to the number of training steps i would like to point out that drq httpsarxivorgpdf200413649pdf only needs a constant size replay buffer of 100k transitions even if training requires 3m steps given that im skeptical that lever would buy much in terms of memory and computation here it would be nice to see a head to head comparison results are demonstrated over 3 random seeds which is too few to get any conclusive statistical evidence given the variance a common practice is to use 10 seeds for dmcontrol and 5 seeds for atari quality while the technical contributions of the paper are limited in novelty and significance and dont meet the high acceptance bar of iclr i still think the paper is well done and could be a good workshop paper clarity the paper in general is clearly written and well organized i particularly appreciate the contribution bullet points and the experimentation roadmap post rebuttal the authors have addressed several of my concerns regarding the methods generality and some experiments while im raising my score to 5 im still not convinced that the paper proposed a valuable contribution to the community comparing rl algorithms in memory or compute footprints instead of the number of interactions with the environment is not meaningful especially when a simulator is in use there are several much simpler things one can do to trade off compute or memory for example rerender observations on the fly from stored low dimensional states thus im voting for a rejection docsep paper summary this work proposes lever a method that modifies general offpolicy rl algorithms with a fixed layer freezing policy for early embedding layers in this particular case a few early layers of a cnn as a direct consequence the method enables storing embeddings in the experience replay buffer rather than observations with a potential decrease in memory required as well as providing a boost in clock time due to fewer gradient computations needed for every update the method is benchmarked with a couple of offpolicy rl algorithms against a few different environments good things the approach is extremely relevant to most of the rl community we are training for longer periods of time and generating significantly more data than we did a few years ago so any method that enables increased training efficiency is extremely welcome the method is simple but clever and the manuscript quite nicely details the steps taken to improve learning stability which can arise due to the obvious possibility of bad model freezes the coverage of related work in the literature throughout the manuscript is excellent and provides enough pointers for the reader to understand how the manuscript is placed in it the experimental setting clearly states hypotheses and questions that will be answered section 5.4 convincingly argues that the freezing method is empirically justified concerns this is a good paper so generally i dont have any strong negative thoughts however i think it would be good to report how the method does when different choices are made with respect to how much of the network is frozen that is in the proposed experiment setting the choices were reasonable but nonetheless a little arbitrary so knowing a little bit more about learning dynamics with this approach would probably make the paper stronger and more robust for future readers questions i wonder whether the authors would shed more details on the transfer learning setting eg whether the transfer capacity changes wrt changes in method hyperparameters such as freezing time different saved embedding functions etc and whether the results do generally show up in more environments/algorithms the reduction in
variance after the freezing is interesting i wonder if the plots could show all the single runs and whether the authors have any explanations for this somewhat consistent change in learning behaviour ### Summary:
this paper presents an approach to improve compute and memory efficiency by freezing layers and storing latent features the approach is simple and provides efficiency gains however there are concerns as well one big concern is that the experiments are not on realistic settings for example real world images and the current cnn is too simple overall the reviewers are split the ac agrees with some of the reviewers that for a paper like this experiments on more realistic settings would make it significantly stronger
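The two ideas debated in these reviews, freezing the early encoder layers at some step and then storing encoder outputs instead of raw frames in the replay buffer, can be illustrated with a short PyTorch-style sketch. The architecture, the freeze switch, and all names below are hypothetical stand-ins chosen for illustration, not the paper's actual code.

```python
# illustrative sketch only: a toy conv encoder + mlp head, a freeze switch,
# and a transition store that keeps latents instead of images once frozen
import torch
import torch.nn as nn

class Agent(nn.Module):
    def __init__(self, obs_channels=3, n_actions=6):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(obs_channels, 32, 3, stride=2), nn.ReLU(),
            nn.Conv2d(32, 32, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),      # -> (batch, 32) latent
        )
        self.head = nn.Sequential(nn.Linear(32, 256), nn.ReLU(),
                                  nn.Linear(256, n_actions))
        self.encoder_frozen = False

    def freeze_encoder(self):
        for p in self.encoder.parameters():
            p.requires_grad = False
        self.encoder.eval()
        self.encoder_frozen = True

    def forward(self, obs, already_encoded=False):
        z = obs if already_encoded else self.encoder(obs)
        return self.head(z)

def store_transition(buffer, agent, obs, action, reward, next_obs):
    # after the freeze point, only the low-dimensional latents go into the buffer,
    # which is where the memory savings discussed by the reviewers come from
    if agent.encoder_frozen:
        with torch.no_grad():
            obs = agent.encoder(obs)
            next_obs = agent.encoder(next_obs)
    buffer.append((obs, action, reward, next_obs))
```

Minibatches sampled after the freeze can then skip the encoder entirely via agent(latents, already_encoded=True), which is where the compute savings come from.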
[input_ids, attention_mask, and labels token-ID arrays omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: Based on empirical observations and neural tangent kernel (NTK) theory, the authors present an explanation of why knowledge distillation (KD) and early-stopped KD (ESKD) are better than using just the one-hot labels, as well as how hard samples have a high-variance curve in the output space. Based on these observations, they propose a novel method called FilterKD, which yields better performance than using just one-hot labels as well as ESKD on standard vision datasets.

Strengths:
1. The theoretical explanation and the illustrative toy experiments are very suggestive of what may actually be happening with ESKD; these contributions are very novel and useful in my opinion. They show that for harder examples, compared to easier examples, the learning path initially takes the network output toward the true $\mathbf{p}$ before making a turn for the one-hot target.
2. The experiments, when performed on more real-world datasets using a proxy called integral difficulty instead of the true $\mathbf{p}$ (as it is unknown), still show very similar patterns, at least for the easy examples, compared to the toy Gaussian dataset where $\mathbf{p}$ can be calculated exactly. This is intellectually satisfying.
3. The experiments show how hard examples, as measured in terms of integral difficulty, exhibit a very different, highly erratic learning path compared to easier examples. More importantly, the authors show that by low-pass filtering the path, it actually looks quite similar to the medium-hardness case. This is very important, as it allows the authors to propose an intuitive, novel KD framework called FilterKD, where the targets are smoothed teacher outputs.
4. The FilterKD algorithm is novel, intuitive, and relatively easy to implement for smaller datasets.
5. The experiments on CIFAR-10 and CIFAR-100 show that FilterKD is indeed better than using ESKD.
6. Experiments with noisy labels are well performed and provide good evidence for the theoretical ideas as well as FilterKD.

Questions and suggestions for improvement: the experimental results need to be improved. I would suggest the following additions:
1. Results of regular KD in addition to ESKD.
2. There are results with only one teacher-student architecture pair; a wider set of teacher-student pairs is required before concluding that the proposed method is better.
3. In Table 1, both the teacher and student have the same architecture. One of the main reasons to do KD is model compression, so having a larger teacher and a smaller student would be quite useful for a larger audience. I would suggest looking at the benchmark and code provided by Tian et al. in Contrastive Representation Distillation (ICLR 2020) as an easy starting point.
4. The memory requirements for storing smoothed targets in memory may be prohibitive, but why can't we load them from permanent memory as and when required, like we do with a dataloader for images and videos?
5. Experiments on a dataset like ImageNet, or at least TinyImageNet, would strengthen the paper more.

Overall, I believe there are solid contributions in this submission, which can be further demonstrated with the suggested experiments. Hopefully the authors will be able to address the weaknesses I have outlined. For now I am giving this a 6 (marginal acceptance); I am happy to increase the rating to the next level based on the authors' response.

Update after author response: I thank the authors for a clear response to my suggestions and the additional experiments. I am satisfied with the additions and modifications, and I am increasing my score. Please try to add ImageNet results if possible, as it may help others in the field who use it as a common benchmark. I have also read the other reviews, and the authors have addressed them well too.

docsep

This paper proposes an explanation for the success of distillation. It first experiments with synthetic Gaussian data; on synthetic data, it shows that distillation works better from an early-stopped model. It probes why and finds that when the one-hot label is far from the true conditional distribution, the early-stopped model's distribution tends to be closer to the true distribution than either the one-hot label or the converged model. The early-stopped model is better because, over the course of training, the model tends to produce a distribution close to the true one before finally overfitting the one-hot label. Some examples show similar patterns for real-world networks/datasets, provided the predictions over training are smoothed. Motivated by these results, the paper proposes to use an averaged distribution of labels from different steps of training rather than the distribution at the end of training, and shows that this improves performance by a little bit.

In terms of novelty, clarity, and significance, I think this paper is good. The question of why distillation improves performance is an important one, and analyzing the learning paths of individual examples is an interesting idea. To my knowledge, the visualization of learning paths is novel, as is the FilterKD method. I enjoyed reading the paper, and I think the ideas described have the potential to inspire interesting future work. With that said, I'm not entirely convinced that the explanation for the success of KD provided here is accurate for real-world KD use cases.

1. A major claim is that KD's success is related to the zigzag pattern, but this claim does not seem to be sufficiently supported by the experiments involving real data. It is shown that on some examples there is a zigzag in the learning path, but there is no direct demonstration that this zigzag is related to the efficacy of KD. One idea is to measure the degree of the zigzag for each example and then verify that it suffices to use KD only for these examples and one-hot training for the others. This seems like the bare minimum thing to show, but the entropy of the model distribution at the early stopping point might be correlated with the degree of zigzag, in which case it seems somewhat trivial that the low-entropy outputs could be replaced with one-hot labels. I am not sure exactly what a good empirical test of the zigzag hypothesis would look like, but I don't find the evidence provided compelling enough.
2. It's not clear to me whether distillation is performed at a temperature of 1 or a temperature that is tuned somehow. Assuming the magnitude of the final logits provides some indication of learning difficulty, I could see distillation at a temperature above 1 providing some of the benefit of early stopping. This also seems like a relevant baseline for FilterKD, since both increasing the temperature and averaging predictions will have the effect of increasing the entropy of the target distribution.
3. The numbers in Table 1 seem uncompelling. The caption does not say whether the ± is standard error, standard deviation, or something else, but assuming it is standard error, I don't think the improvements on CIFAR-10 or AG News would be statistically significant. The improvements from KD on these datasets are small to begin with, so they may not be particularly good benchmarks for this work. The performance with noisy labels is more compelling, but doesn't help to validate that the success of KD on clean data is attributable to the phenomenon described.
4. I'm suspicious of the idea of base difficulty. Base difficulty is defined as the L2 distance between the one-hot label and the true conditional distribution. A nice aspect of base difficulty is that it depends only on the true data distribution; however, it seems to capture only two kinds of difficult examples: those that are ambiguous (if the conditional distribution has high entropy) or incorrectly labeled (if the conditional distribution has low entropy). The proposed integral difficulty measure further captures the difficulty of learning particular examples and is somewhat similar to previously proposed metrics such as the c-score (https://arxiv.org/abs/2002.03206); see also the learning-speed-based proxies in that paper, one of which is equivalent to integral difficulty, or the variance of gradients (https://arxiv.org/pdf/2008.11600.pdf). Intuitively, data points with low base difficulty could have high integral difficulty if they are located in regions of the distribution with low density. The labels in CIFAR-10/100 are relatively clean and the images are relatively unambiguous, so I suspect that most of the high-integral-difficulty examples do not actually have high base difficulty.

Minor:
- Early-stopped knowledge distillation is introduced without any further description; only from the appendix is it clear that it means knowledge distillation from an early-stopped teacher, as opposed to early stopping during knowledge distillation.
- IIUC, the intuition from Section 3.3 is that the reason for the zigzagging in Figure 3 is that the label assigned to a given point changes not only based on the gradient of the loss for that point but also on the gradients of the losses on other, similar points. This point seems intuitive even without bringing up $\mathcal{A}_t$ or the empirical NTK, and I am not sure that Proposition 1 helps to understand it. Further analysis of the dynamics of $\mathcal{A}_t$ and the stationarity of the empirical NTK might make this decomposition more interesting, but the analysis in Appendix E doesn't seem conclusive.
- Missing paren in the definition of integral difficulty on p. 5.
- The idea of taking a moving average of network weights, as explored in Appendix G, is much older than MoCo, and it would be good to cite the relevant literature, e.g. Ruppert (https://ecommons.cornell.edu/bitstream/handle/1813/8664/TR000781.pdf?sequence=1), Polyak and Juditsky (https://epubs.siam.org/doi/abs/10.1137/0330046), and the Inception paper (https://www.cv-foundation.org/openaccess/content_cvpr_2015/html/Szegedy_Going_Deeper_With_2015_CVPR_paper.html), which AFAIK was the first, or among the first, papers to apply this method in deep neural network training.

This paper is well written and novel, tackles an important question, and contains potentially interesting ideas. However, the evidence provided is not quite sufficient to validate the claims made. The suggestion that the zigzag in the learning trajectory is important to the success of distillation on real, non-noisy data is not well demonstrated, and the benefits of the proposed FilterKD technique seem marginal on real-world data.

docsep

This paper analyzes the learning process in knowledge distillation and proposes a method called FilterKD. It was evaluated using CIFAR-10, CIFAR-100, and AG News. The observed zigzag pattern is interesting and gives one reason that early stopping is effective in knowledge distillation. The improvement obtained by early stopping and the proposed FilterKD is rather minor. I am still not sure the proposed method is justified well in theory.
It is based on NTK theory, but that is applicable only to a sufficiently wide network. Also, several zigzag patterns are shown in the figures, but we are not sure of their definition and how often they appear in the real dataset. The improvement in accuracy is rather minor.

docsep

This paper introduces a method for doing knowledge distillation with noisy labels. The contribution consists of representing distilled labels as a weighted moving average of the predicted labels from a teacher as it trains using noisy one-hot encoded labels; finally, a student is trained using the distilled labels (see the sketch after the summary below). This review has been updated since it was originally posted; see thread below.

Strengths:
- The paper is for the most part well structured and easy to read. The argument and ideas are developed step by step, formulated and substantiated formally based on previous work, establishing the hypotheses up front, starting from a toy example, building up the contribution from initial observations, and finally moving on to more complex/realistic scenarios afterwards.
- Tackles relevant problems for the ML community, namely knowledge distillation and noisy labels.

Weaknesses:
- The concept of noise is not well defined, to the point where the problem definition becomes confusing. Is the noise big enough to flip the target label? More precisely, does $\arg\max p(y \mid x_i) = \arg\max \mathbb{E}[y_i]$ always hold? If it does, the hard sample from Figure 3 does not seem to comply. If the equality doesn't hold, the problem definition is more about data with wrongly labeled samples; in this case the insights from Section 3 will be contingent on the number of samples for which the equality mentioned above doesn't hold, i.e. mislabeled samples. This aspect has not been addressed in the paper. Moreover, the concept of easy, medium, and hard samples does not explicitly compensate for mislabeled samples, and hard samples could trivially correspond to the mislabeled samples.
- KD, ESKD, and LS depend substantially on a temperature hyperparameter, which the paper does not address. Having the wrong temperature can drastically affect the performance of these baselines.
- The labels from ESKD can be seen as a special case of the proposed FilterKD where $\alpha = 1$; therefore one could expect the labels from an ESKD teacher to be similar to those of FilterKD if the teacher were trained using fewer epochs, in other words if the early-stopping criterion forced the teacher to stop sooner. Even though ESKD is a valid baseline, is it the most compelling one to justify the introduction of FilterKD?
- Does not use a real dataset (i.e. not a toy dataset) for which $\mathbf{p}$ is available, like CIFAR-10H [1], to evaluate Hypothesis 1.

Detailed comments:
- The abstract is not a clear description of the paper; it lacks context and coherence. It is not clear what the contributions are: is it a criterion to measure generalization, a function to measure difficulty, or a novel way to do knowledge distillation?
- Increase the text size in all figures, at least as big as the figure captions.
- "In this paper we first clarify what makes supervision good for a classification problem and then explain two existing label refining methods, label smoothing and knowledge distillation (KD), in terms of our proposed criterion": does the paper clarify, or does it propose, a new criterion to quantify generalization performance?
- "We look deeper into the learning path of the network's predicted distribution for training samples with different difficulty": what is the learning path? How is difficulty defined? Make it clear that these are definitions that will be introduced later on.
- Evaluate labels produced by teachers trained with KD, ESKD, LS, and FilterKD w.r.t. the human-annotated labels from CIFAR-10H [1].
- Proposition 3 from Menon et al. lacks context; I recommend explaining further where it comes from, to make the argument more self-contained.
- Figure 1b,c: the range on the y-axis is rather small (a coverage of 0.7 for accuracy and 0.02 on ECE), exaggerating the correlation between accuracy and the similarity between $\mathbf{p}$ and $p^{tar}$. Also, KD and LS depend on a temperature parameter, which according to the supplementary material has only been tested with one setup.
- "leads to better generalization performance as measured either by accuracy (acc) or expected calibration error (ECE)": it is critical to define the term generalization here, i.e. whether these are the two criteria that are going to define generalization, or whether distribution shifts are also being considered.
- Figure 2: it would be interesting to see the distribution of the entire dataset for each axis (https://matplotlib.org/stable/_images/sphx_glr_scatter_hist_001.png); the scatterplot can hide how dense the distribution is and how thin the tails are.
- Figure 3, hard sample: how can the $\mathbf{p}$ label for the hard sample be so close to the bottom-right corner, say (0.1, 0.8, 0.1), and yet the one-hot encoding be all the way across on the bottom-left corner (1, 0, 0)? It looks as if the samples were mislabeled; refer to the first point under weaknesses.
- Figure 3: how representative is this behavior across the dataset? How many easy/medium/hard samples are there?
- Figure 3: the paths on the right are close to impossible to see and interpret.
- Proposition 1: $x_o$ has not been defined within the proposition.
- "however for a harder sample (middle) it is very difficult to observe any patterns": capitalization.
- Figure 4: what does the black dashed line mean?
- Section 5.1: what labels are used for the student's test set? Is it the original one-hot targets or a label set produced by the corresponding teacher?
- Section 5, Figure 5: how does FilterKD compare to models trained using LS and KD with the optimum temperature? Any reason to leave them out?

References:
[1] Peterson, Joshua C., et al. "Human uncertainty makes classification more robust." Proceedings of the IEEE/CVF International Conference on Computer Vision, 2019.

The paper is easy to follow and has contributions that are sufficiently motivated. The empirical evaluation shows that the proposed method is more robust to noisy labels. However, the problem definition, and therefore the scope of the paper, is not clear, and the core contribution, FilterKD, has not been evaluated to an extent that justifies its simplicity.

Update: after discussing with the authors (see thread below), I have updated my review to reflect my view of the paper.

### Summary: The authors made substantial improvements to the originally submitted manuscript. However, reviewers initially remained reluctant to support the paper for acceptance, based on the degree to which they were confident in the underlying arguments, the position taken by the authors, and the evidence provided to support that position. There are also concerns about the significance of the gains in performance afforded by the proposed approach. During the author response period, two reviewers became satisfied with the additions and modifications, leading to an increase in the final score. It will be critical for the authors to try to add ImageNet results if possible, in addition to the other promises made to reviewers. The AC recommends accepting this paper.
[input_ids, attention_mask, and labels token-ID arrays omitted]
18974, 310, 1774, 281, 253, 2323, 273, 940, 21755, 327, 1524, 1327, 2369, 17976, 941, 310, 417, 6210, 392, 15125, 1344, 456, 285, 253, 5373, 273, 253, 4081, 5806, 76, 69, 5853, 1646, 16888, 327, 1524, 10186, 941, 5474, 33032, 2520, 2929, 3537, 13505, 253, 4715, 1232, 275, 3640, 940, 21755, 285, 29328, 247, 1332, 1925, 5806, 76, 69, 352, 369, 6760, 407, 970, 260, 338, 274, 740, 260, 338, 274, 2313, 285, 639, 3668, 253, 2540, 48567, 47482, 3102, 310, 4722, 285, 4245, 581, 1921, 326, 2393, 15910, 310, 3576, 275, 3640, 940, 21755, 253, 7756, 2797, 407, 2393, 15910, 285, 253, 4081, 5806, 76, 69, 310, 2581, 5884, 50276, 74, 717, 1335, 417, 2119, 253, 4081, 1332, 310, 17285, 973, 275, 3762, 352, 310, 1754, 327, 295, 17922, 3762, 533, 352, 310, 7763, 760, 323, 247, 10481, 4618, 2990, 671, 2067, 48567, 47482, 6127, 403, 2011, 275, 8442, 533, 359, 403, 417, 2119, 273, 616, 5426, 285, 849, 2223, 352, 4620, 275, 253, 1524, 10895, 253, 7756, 275, 7200, 310, 2581, 5884, 5474, 33032, 2520, 2929, 23970, 247, 1332, 323, 2509, 3640, 940, 21755, 342, 27620, 13301, 253, 7680, 8414, 327, 9999, 35755, 13301, 347, 247, 17375, 4886, 3388, 273, 253, 8131, 13301, 432, 247, 9732, 347, 352, 18784, 970, 27620, 581, 12022, 16202, 13301, 4720, 247, 5974, 310, 10166, 970, 253, 35755, 13301, 436, 2278, 556, 644, 9300, 1580, 352, 369, 8927, 9269, 923, 6293, 2708, 50275, 296, 3755, 20556, 50276, 783, 2929, 310, 323, 253, 954, 629, 973, 34218, 285, 3477, 281, 1239, 4154, 285, 5697, 403, 3715, 3213, 1615, 10539, 26115, 285, 4326, 4215, 19186, 1754, 327, 2045, 789, 50276, 15425, 3647, 9079, 598, 2914, 50276, 45033, 432, 247, 20953, 1650, 3652, 598, 253, 7680, 432, 3302, 7313, 285, 4720, 4886, 327, 281, 625, 2570, 6549, 2531, 15216, 846, 50276, 85, 471, 868, 247, 4623, 3237, 323, 253, 13361, 3114, 10775, 3640, 940, 21755, 285, 27620, 13301, 50275, 20881, 1255, 265, 50276, 783, 4473, 273, 6046, 310, 417, 6210, 392, 37224, 281, 253, 1127, 835, 253, 1895, 5426, 4916, 21643, 310, 253, 6046, 1943, 2217, 281, 19153, 253, 2303, 5203, 625, 10534, 1057, 1736, 2781, 74, 7239, 2981, 50276, 1662, 2781, 74, 2046, 74, 1900, 2186, 604, 352, 1057, 253, 1892, 3410, 432, 4677, 495, 1057, 417, 1646, 281, 16041, 604, 253, 13919, 36908, 2186, 253, 1895, 5426, 310, 625, 670, 941, 342, 47723, 13130, 3530, 275, 436, 1083, 253, 16039, 432, 2593, 495, 588, 320, 32391, 327, 253, 1180, 273, 3530, 323, 534, 253, 13919, 5393, 1840, 36908, 2186, 26332, 3731, 22027, 3530, 436, 4809, 556, 417, 644, 9713, 275, 253, 2929, 25761, 253, 4473, 273, 3477, 4646, 285, 1892, 3530, 1057, 417, 11120, 23514, 323, 3731, 22027, 3530, 285, 1892, 3530, 812, 35820, 1365, 2723, 281, 253, 3731, 22027, 3530, 50276, 76, 69, 1578, 76, 69, 285, 35253, 3469, 9619, 327, 247, 3276, 4373, 19484, 534, 253, 2929, 1057, 417, 2953, 1907, 253, 3430, 3276, 476, 31063, 2818, 253, 3045, 273, 841, 1666, 25379, 50276, 783, 13301, 432, 1578, 76, 69, 476, 320, 2326, 347, 247, 2714, 1083, 273, 253, 4081, 5806, 76, 69, 835, 9765, 50276, 18, 3103, 581, 812, 1902, 253, 13301, 432, 271, 1578, 76, 69, 9732, 281, 320, 2074, 281, 1110, 273, 5806, 76, 69, 604, 253, 9732, 497, 10166, 970, 1679, 44540, 275, 643, 3000, 604, 253, 2393, 11769, 2784, 17705, 5621, 253, 9732, 281, 3523, 19473, 1014, 2167, 1578, 76, 69, 310, 247, 3588, 8245, 310, 352, 253, 954, 18511, 581, 281, 15249, 253, 10199, 273, 5806, 76, 69, 50276, 18566, 417, 897, 247, 1524, 10895, 26332, 417, 247, 20953, 10895, 323, 534, 268, 310, 2130, 751, 260, 338, 274, 740, 73, 337, 281, 7472, 9079, 337, 50275, 5992, 7193, 5701, 50276, 783, 12002, 310, 
417, 247, 2590, 5740, 273, 253, 2929, 19756, 3634, 285, 25253, 352, 310, 417, 2590, 752, 253, 9021, 403, 310, 352, 247, 17705, 281, 2557, 26647, 247, 1159, 281, 2557, 10183, 390, 247, 4460, 1039, 281, 513, 3640, 940, 21755, 50275, 19687, 511, 2505, 1979, 275, 512, 8442, 387, 1878, 347, 1943, 347, 253, 8442, 11743, 50275, 249, 436, 2929, 359, 806, 19148, 752, 2789, 20446, 1175, 323, 247, 9162, 1895, 285, 840, 5513, 767, 5368, 5203, 1275, 1699, 3082, 5203, 36971, 285, 3640, 940, 21755, 465, 69, 275, 2426, 273, 776, 4081, 17705, 50276, 783, 2929, 8254, 7790, 390, 29328, 247, 747, 17705, 281, 22048, 26647, 3045, 50275, 664, 1007, 12861, 715, 253, 4715, 1854, 273, 253, 6928, 8131, 3268, 323, 3733, 3530, 342, 1027, 10183, 50276, 5371, 310, 253, 4715, 1854, 849, 310, 10183, 2931, 1056, 352, 2590, 326, 841, 403, 14308, 326, 588, 320, 5611, 1996, 327, 50275, 45141, 13301, 4197, 407, 10954, 10166, 327, 465, 69, 1578, 76, 69, 35253, 285, 5806, 76, 69, 8772, 1966, 28267, 13301, 432, 260, 338, 274, 740, 73, 337, 50275, 856, 3321, 495, 432, 1821, 251, 1162, 355, 19756, 3634, 891, 5583, 15571, 2007, 835, 352, 3249, 432, 281, 1056, 253, 4154, 625, 1881, 41010, 50275, 13206, 337, 12847, 253, 2491, 327, 253, 340, 10565, 310, 2581, 1355, 7031, 273, 18188, 323, 7200, 285, 209, 4699, 327, 299, 336, 23668, 839, 5921, 875, 7200, 285, 14259, 875, 268, 285, 268, 17447, 671, 465, 69, 285, 35253, 3469, 327, 247, 3276, 4764, 534, 2556, 281, 253, 24864, 2144, 556, 760, 644, 5762, 342, 342, 581, 9978, 50275, 282, 6594, 281, 1805, 26647, 3045, 347, 4080, 2057, 407, 7200, 756, 390, 3264, 18543, 2228, 299, 336, 50276, 262, 310, 4619, 281, 4853, 253, 1307, 26647, 1060, 604, 841, 403, 253, 767, 6866, 326, 403, 1469, 281, 4853, 26647, 390, 403, 3268, 15036, 671, 1146, 2783, 50275, 13206, 374, 352, 651, 320, 4722, 281, 923, 253, 3268, 273, 253, 2862, 10895, 323, 1016, 7844, 3614, 2056, 14095, 4658, 2061, 11351, 35690, 405, 545, 89, 3129, 83, 1026, 2569, 15700, 2874, 8567, 253, 24493, 14095, 476, 10507, 849, 14086, 253, 3268, 310, 285, 849, 6906, 253, 32936, 403, 50275, 13206, 495, 1892, 3410, 849, 476, 253, 268, 5203, 323, 253, 1892, 3410, 320, 594, 2810, 281, 253, 5004, 987, 7145, 1333, 14805, 16331, 14805, 285, 2568, 253, 581, 12022, 9706, 320, 512, 253, 1039, 2439, 327, 253, 5004, 1669, 7145, 337, 470, 470, 352, 4453, 347, 604, 253, 3530, 497, 3731, 22027, 3730, 281, 253, 806, 1127, 762, 32213, 50275, 13206, 495, 849, 8612, 310, 436, 3879, 2439, 253, 10895, 849, 1142, 1842, 1105, 264, 1514, 10984, 3530, 403, 627, 50275, 13206, 495, 253, 11865, 327, 253, 987, 403, 2810, 281, 7479, 281, 923, 285, 4665, 50275, 856, 3321, 337, 1269, 80, 556, 417, 644, 2931, 1561, 253, 13989, 50275, 35529, 323, 247, 12150, 3410, 4766, 352, 310, 1077, 2834, 281, 10018, 667, 6127, 50276, 38479, 1320, 50275, 13206, 577, 752, 1057, 253, 2806, 17889, 1386, 1599, 50275, 4674, 8319, 752, 13301, 403, 908, 323, 253, 3484, 1071, 873, 310, 352, 253, 3236, 258, 384, 390, 247, 5203, 873, 407, 253, 3969, 9732, 50275, 4674, 608, 50276, 13206, 608, 849, 1057, 5806, 76, 69, 7277, 281, 3210, 10166, 970, 35253, 285, 465, 69, 342, 24571, 3276, 667, 1921, 281, 3553, 731, 562, 50275, 250, 3065, 337, 268, 2521, 251, 480, 6934, 5738, 260, 1162, 355, 1966, 11649, 2789, 9162, 625, 10237, 10061, 273, 253, 26332, 70, 886, 39985, 5213, 8059, 327, 4382, 8113, 6247, 253, 2929, 310, 3477, 281, 956, 285, 556, 9021, 326, 403, 10481, 17194, 16774, 7103, 2722, 326, 253, 4081, 1332, 310, 625, 10237, 281, 27620, 13301, 2299, 253, 1895, 5426, 3103, 253, 7990, 273, 253, 2929, 310, 
417, 2590, 285, 253, 5161, 7680, 5806, 76, 69, 556, 417, 644, 6760, 281, 271, 6070, 326, 816, 7790, 697, 17647, 50276, 11183, 846, 16585, 342, 253, 4477, 923, 6293, 2708, 891, 452, 9300, 619, 2278, 281, 4887, 619, 1859, 273, 253, 2929, 2490, 187, 4118, 18435, 27, 783, 4477, 1160, 6832, 50276, 49831, 942, 281, 253, 8927, 9262, 7714, 2299, 30628, 8523, 6376, 27821, 281, 1329, 253, 2929, 323, 14924, 1754, 327, 253, 4248, 281, 534, 597, 497, 13224, 275, 253, 6944, 7125, 50276, 3321, 2668, 407, 253, 4477, 285, 253, 1941, 2530, 281, 1329, 616, 1899, 285, 7125, 627, 403, 671, 7350, 670, 253, 8453, 273, 253, 15988, 275, 3045, 26299, 407, 253, 4081, 2746, 50276, 32674, 253, 2488, 2380, 2180, 767, 30628, 3395, 10048, 342, 253, 30733, 285, 14586, 4283, 281, 271, 2572, 275, 253, 2457, 4868, 352, 588, 320, 4619, 323, 253, 4477, 281, 1611, 281, 823, 4440, 257, 292, 1543, 604, 1896, 275, 1635, 281, 643, 16966, 1160, 281, 30628, 50276, 783, 913, 32636, 18738, 436, 2929 ]
Below is a review of a research paper from a conference or journal. Please write a summary of the review.

### Review:
this paper presents an imitation learning algorithm named wasabi that enables quadruped robots to learn from rough and partial expert demonstrations using generative adversarial learning instead of using taskspecific reward functions as with many robotics learning approaches the proposed approach can infer rewards from demonstrations using a gan in particular the authors incorporate taskagnostic regularization terms and adversarial rewards to enable the robot to learn stable and agile locomotion behaviors moreover this approach can learn only from partial and physically incompatible demonstrations eg a human operator holds and moves the robots trunk in a specific fashion the authors also showed simulation and experimental results of a quadruped robot learning four challenging tasks the paper shows that wasabi outperforms the leastsquares gan lsgan in most of the tasks 3 out of 4 in terms of motion similarity measured by dtw distance

strength
able to learn from limited demonstrations without defining taskspecific reward functions which often need to be designed and tuned by hand
replacing the lsgan loss with the wasserstein loss improves the efficiency in discriminating the reference and the generated motions
able to prevent mode collapse using a gradient penalty term in the loss function
demonstrating the performance of the proposed approach in hardware experiments

weakness
the final reward function also includes a regularization reward which seems to be specific to the robot how much does this term contribute to the total reward
the observation space does not include the position/orientation of the robot base it only includes linear and angular velocities would the demonstration speed of the human operator affect the final learned locomotion

docsep
the paper proposed an adversarial imitation learning method to learn agile motor skills of a quadrupedal robot from partial demonstrations the main contributions of the paper are the following two points 1 the authors introduced the wasserstein loss to the adversarial reward learning 2 the authors showed objective and reward functions to acquire the agile motor behaviors from the rough and partial demonstrations

strength
the authors successfully demonstrated that the proposed method wasserstein adversarial behavior imitation wasabi could learn the agile behaviors from the handcrafted partial state information
the authors successfully showed that the agile behaviors of not only the simulated but also the real quadrupedal robot could be generated with wasabi by transferring the learned policies

weakness
although the experimental results are fascinating the paper seems difficult to read especially readers may need to read the original paper of adversarial motion priors amp peng et al tog 2021 to understand the learning process of the adversarial reward learning and policy training to improve this for example i recommend that the authors make a new section to explain the original amp method as a preliminary in the paper
the ablation study of the proposed reward function eq 6 is missing

minor comments
since the backflip behavior of the real quadrupedal robot is impressive it would be better if the authors show snapshots of the backflip in the paper like figure 1 left
figure 1 right is hard to see

docsep
in this paper the authors proposed a ganbased framework to train agile locomotion policies using partial demonstrations similar to amp adversarial motion prior work the authors use a wasserstein gan to discriminate between the expert demonstrations and the learned policies to generate the reward signal for reinforcement learning different from previous works the demonstrations in this work are not from experts dogs for example instead only rough base movement demonstrations are provided for the learner policy to figure out dynamically feasible motions that can realize the base trajectories the authors have demonstrated successful deployment of the learned policies on the hardware with zeroshot simtoreal transfer

the strengths of the paper
1 the authors show that it is possible to only provide base trajectories as demonstrations to train imitation policies for quadruped robots this opens many new potentials for quadruped robots since expert demonstrations are often hard to obtain
2 there is a good ablation study and analysis on the different gan losses least squares vs wasserstein
3 the learned policy can transfer to the real hardware on various agile tasks including back flipping

the weaknesses of the paper
1 the delta between amp and this work is not quite large especially since amp has also been tested on hardware recently the main differences seem to be 1 whether or not to include the joint angles as part of the discriminator observations for reward generation and 2 use of wgan vs lsgan
2 not enough baselines to compare with i think it will be great to compare amp with the proposed method on the selected tasks

docsep
the authors present a novel generative adversarial method named wasabi for inferring reward functions from partial and potentially physically incompatible demonstrations wasabi simultaneously trains a wasserstein gan to distinguish between real and generated observations and a policy which uses the output of the discriminator as a reward this approach allows the use of partial demonstrations which dont contain full information about the state the authors experimentally compare the performance of wgan and lsgan for reward inference and experimentally validate the proposed method in simulation and on a quadruped robot training complex skills such as a backflip the paper proposes an extension of the generative adversarial imitation learning method gail the wasserstein gan formulation allows the discriminator output to be used as a reward function directly with only simple normalization this method allows training a policy for complex behaviour from partially observed expert demonstrations the paper is well written and enjoyable to read

### Summary:
the paper proposes an extension of the generative adversarial imitation learning method gail and shows how it can be used to generate a backflip for the solo8 quadruped using a handheld human demonstration in addition to leap wave and standup motions it builds on the general approaches of gail and more recently amp the paper has recommendations of strong accept and weak accept x3 the remaining weaknesses and issues are thoroughly addressed in the rebuttal this is a paper that will be appreciated by all the hardware demonstrations are impressive my recommendation accept oral consider for best paper

strengths
well written
ability to learn from limited demonstrations ie a body trajectory sketch without reward functions
important details and related ablations: lsgan vs wasserstein loss, gradient penalty term in the loss function
hardware demonstrations

weaknesses
delta to the amp paper is not that large why not compare with amp now addressed in supp d
final regularization reward may still be robotspecific reward ablation videos now added
paper may be difficult to read ie may need to read amp sec 31 improved in revised version
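For readers unfamiliar with the mechanism these reviews refer to, the sketch below illustrates how a Wasserstein critic with a gradient penalty can double as an imitation reward. This is a minimal, hypothetical PyTorch-style example assembled only from the reviews' description, not the wasabi authors' implementation; the network sizes, the gradient-penalty weight, and the reward normalization are assumptions made purely for illustration.

```python
# Minimal sketch (assumed, not the authors' code): a Wasserstein critic whose
# score is reused as an adversarial imitation reward.
import torch
import torch.nn as nn


class Critic(nn.Module):
    def __init__(self, obs_dim, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # unbounded score, no sigmoid (Wasserstein)
        )

    def forward(self, x):
        return self.net(x)


def critic_loss(critic, demo_obs, policy_obs, gp_weight=10.0):
    # Wasserstein objective: score demonstrations high, policy rollouts low.
    loss = critic(policy_obs).mean() - critic(demo_obs).mean()
    # Gradient penalty on interpolated samples (the reviews note this helps
    # against mode collapse); weight 10.0 is a common but assumed choice.
    eps = torch.rand(demo_obs.size(0), 1, device=demo_obs.device)
    mix = (eps * demo_obs + (1 - eps) * policy_obs).requires_grad_(True)
    grad = torch.autograd.grad(critic(mix).sum(), mix, create_graph=True)[0]
    gp = ((grad.norm(2, dim=1) - 1) ** 2).mean()
    return loss + gp_weight * gp


def adversarial_reward(critic, obs):
    # The critic score is used directly as a reward after a simple
    # normalization; batch standardization here is a stand-in only.
    with torch.no_grad():
        score = critic(obs).squeeze(-1)
    return (score - score.mean()) / (score.std() + 1e-8)
```

In a training loop, the critic would be updated on batches of demonstration and policy observations via `critic_loss`, while the reinforcement learner consumes `adversarial_reward` in place of a hand-designed task reward.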
Below is a review of a research paper from a conference or journal. Please write a summary of the review.

### Review:
this work derives a new generalization bound for estimating causal effects of continuous treatments the new bound is based on the idea of generalized propensity score reweighting and is obtained via a sequence of inequalities translating the original marginal treatment effect which is hard to estimate to a new quantity that is easier to estimate the generalization bound has been used as the objective function for training a deep neural network whose architecture has been adapted from existing works experimental results show stateoftheart performance of the proposed method / objective function

originality
the proposed bound in this work seems novel to me one question about related literature on generalization bounds is raised in the next section

quality and clarity
this work is overall of high quality and is very clear in terms of presentation the only concern i have is about the soundness of the bounds and is listed in the next section

significance
judging from the experimental results the improvements seem significant however it is not entirely clear to me why the method can perform so well the bound seems loose and it is interesting to understand why it can still perform well a better understanding is needed in order for the method to be more trustworthy

some minor remarks
1 in line 127 hat u is a typo
2 in line 198 the sum of the reweighted factual loss should it be an average
3 more details are needed for line 201 to line 207 i think the mmd is defined for both continuous and discrete domains while the authors seem to argue that it cannot be obtained for continuous domains am i understanding correctly the authors do not discuss this

docsep
this paper estimates the causal effects of continuous treatments to balance the covariates among infinite subpopulations they learn resampling weights that reduce the ipm distance between observed and counterfactual groups thus they derive an upper bound on the estimated counterfactual error and demonstrate experimentally that the proposed algorithm admit based on the derived upper bound outperforms gps ebct drnet scigan and vcnet

strengths
1 provide a theoretical guarantee for the causal effect estimation of continuous treatments
2 give a comprehensive summary of the current studies on continuous treatment effect estimation

weaknesses
1 lack of innovation the proposed framework is mostly based on previous works such as gps cfr and drnet
2 limited contribution although the authors provide generalization bounds for estimating causal effects of continuous treatments the theoretical proofs are an extension of cfr shalit uri fredrik d johansson and david sontag estimating individual treatment effect generalization bounds and algorithms international conference on machine learning pmlr 2017 the experimental analysis is relatively insufficient for the continuous treatments

docsep
this paper derives an error bound for treatment effect estimation in the continuous treatment dosage regime the key insight is to relate this error to the bias introduced by distributional shift in treatment assignment using ipm distances between distributions the authors further provide an algorithm inspired by the theoretical upper bound as well as empirical validation for the proposed method the paper is by and large an extension of 1 to continuous treatment domains by leveraging a discretized version of the ipm metric used in 1 as well as importance sampling with learned weights 2 3 the idea itself is a good addition to the causal inference literature but i have reservations regarding the execution both from a technical and a quality-of-writing standpoint

overall structure
there is not enough contextualization with previous work for example 1 2 3 are only briefly mentioned and their connection with the current work is not made obvious the entire related work section lacks clarity
the introduction is also confusing especially for those unfamiliar with the advances in this specific area of causal inference the different paragraphs dont seem connected and its unclear what the state of the art that this paper is improving upon actually is
section 44 and the algorithm box are too underdeveloped it is unclear what the different quantities phi h w are until you read the text and the text doesnt have enough explanations about how to execute the different steps of the algorithm eg how to compute the ipm gradient or how to choose delta appropriately the appendix is very sparse and doesnt contain additional information that could answer this question

technical weaknesses
theorem 1 also holds for sigma_min = 0 for example when yt = ft(x) ie the counterfactual outcome under treatment t depends solely on observed features
lemma 1 seems somewhat restrictive given that the loss function has to be in the family of functions that defines the ipm what are the implications for the squared loss
emse is a population quantity what are the finite sample guarantees if any
how do you actually calculate the ipm gradients and how does it contribute to the computational complexity how do the sample and computational complexities depend on the choice of delta

concerns about reproducibility
the simulations were run on 3080 gpus according to the appendix what were the computational bottlenecks how does the gpu usage compare across methods
the experimental results are encouraging but there are some issues with the performance of the benchmarks for example the dgp is similar to the one in the vcnet paper 4 but their performance was more along the lines of ≈0.15 rather than the ≈0.19 found in this paper and if the discrepancy comes from the minor dgp modifications have you tuned the parameters of vcnet to the new dgp
overall i dont think this work is ready for publication yet

1 uri shalit fredrik d johansson and david sontag estimating individual treatment effect generalization bounds and algorithms
2 negar hassanpour and russell greiner counterfactual regression with importance sampling weights
3 fredrik d johansson uri shalit nathan kallus and david sontag generalization bounds and representation learning for estimation of potential outcomes and causal effects
4 lizhen nie mao ye dan nicolae et al vcnet and functional targeted regularization for learning causal effects of continuous treatments

the authors have not addressed the computational limitations of their algorithm besides the fact that ≈3000 gpus were used

docsep
the authors tackle the problem of estimating causal effects under continuous treatments by proposing a novel framework to estimate the average dose response function adrf the core idea of their framework is to minimize a bound on the adrf loss instead of minimizing the error in estimating the adrf directly the adrf loss consists of both a factual loss exact and an upper bound on the counterfactual loss which is derived via a reweighting schema and the use of an integral probability metric ipm that measures the distance between the factual and counterfactual distributions the authors provide a practical implementation of this theoretical bound under a smoothness assumption on the shift between covariate distributions in different subpopulations they call their algorithm admit which they test in synthetic and semisynthetic settings

strengths quality significance
the authors provide a clear and wellmotivated development of their upper bound on the adrf loss their connection of the theory to practical implementation was also particularly well done

weaknesses quality
there are a few improvements that could be made to the empirical results section of the paper increasing the thoroughness of the empirical section is the only major weakness in my mind for one it would be useful to provide another view of the results by eg providing the actual dose response curve ie response vs dosage for the estimation and the ground truth this would help the reader get a better idea as to which regions of the treatment may be better in terms of estimation vs not secondly it would be interesting to see how the method performs in the setting where some of the confounders were unobserved or simulated to be unobserved ie you conceal them although an initial assumption is one of unconfoundedness characterizing the failure modes of the method is generally useful at a high level the authors need to do a better job at discussing the limitations and failure modes of their approach

clarity minor
there are a few areas where there are some grammatical and spelling errors see below for a few examples not exhaustive

no there was not a significant discussion of the limitations of the approach which is a weakness of this paper see above for suggestions to improve this ie assumption 3 discussion more thorough empirical analyses

### Summary:
the authors propose theory and an algorithm for estimating average dose-response functions adrf from observational data under assumptions of unconfoundedness and overlap the approach extends theory and methodology primarily from the work in 1-3 where neural networks and integral probability metrics are used to learn outcome regressions and reweighting functions to minimise a bound on the expected loss the approach was evaluated on semisynthetic datasets and compared favourably to baselines reviewers found the setting novel and interesting but were concerned that the analysis was very close to previous works requiring only a small modification to allow for continuous rather than binary treatments the empirical evaluation was also rather limited restricted to comparing mean squared errors on benchmark datasets one of the reviewers asked why we should expect the method to perform so well when the learning objective represents a fairly loose bound on the expected error the empirical results offer little to answer this question the authors rebuttal suggests that this is due to the reweighting function but there is no empirical or theoretical evidence that this is the deciding factor for example how does the admit model perform without reweighting in figure 3 the authors claim to show that baselines perform worse when selection bias increases but this trend is noisy at best if anything i would argue that it suggests that admit does better no matter the selection bias which begs the question where is the advantage coming from overall reviewers thought the paper appears sound and offered a few clarifying comments and questions which were mostly answered by the authors the technical novelty is rather low but appropriately applied a revised version of the manuscript should address the presentation issues raised by reviewers as well as the attribution question asked above
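The bound these reviews discuss can be sketched schematically as follows. This is a reconstruction from the reviews only, not the paper's exact statement: the weighting function w, the function class G, the loss l, and the constants B and C are placeholders introduced for illustration.

```latex
% Schematic of the kind of bound described above (all symbols are placeholders):
% expected ADRF error <= reweighted factual loss + IPM distance term + constant
\epsilon_{\text{ADRF}}(h)
  \;\le\;
  \frac{1}{n}\sum_{i=1}^{n} w(x_i, t_i)\,\ell\bigl(h(x_i, t_i), y_i\bigr)
  \;+\;
  B \cdot \mathrm{IPM}_{\mathcal{G}}\bigl(p_{w}(x \mid t),\, p(x)\bigr)
  \;+\; C
```

The first term is the reweighted factual loss that can be computed from observed data, the IPM term penalizes the remaining distance between the reweighted factual distribution and the counterfactual one, and the constant absorbs irreducible and finite-sample effects; minimizing such an objective is what the reviews refer to as training with the bound rather than with the (unobservable) counterfactual error.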
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this work proposes the neural shuffleexchange network to capture both local and global dependencies for 2d data the idea extends the 1d neural shuffleexchange network to its 2d application the proposed method first converts 2d data to 1d following the zorder then applies several quaternary switch and quaternary shuffle layers and finally converts the data back to 2d space the experimental results show that the proposed method can obtain better performance with reasonable computational cost strengths the proposed method is very interesting it studies how to capture longterm dependencies for 2d data in an efficient way it uses the zorder curve to flatten 2d data into 1d it can preserve locality information for features the experimental results show the proposed method can obtain better performance weaknesses this work extends the neural shuffleexchange network from 1d to 2d compared with the 1d nse its technical contribution is incremental and the novelty is limited even though the zorder curve can preserve locality information it is not clear whether the spatial information in 2d data can be preserved it should be discussed the attention mechanism can learn nonlocal dependencies effectively it should be compared in the experimental studies in addition it is true that the vanilla version of attention has on4 complexity but there are several improved variants with competitive performance but lower computational cost the proposed method is also applied to graph data and is compared with gin however for graph data i would suggest conducting experiments on benchmark datasets update after rebuttal i have read the authors rebuttal i still believe the novelty is limited and hence i keep my score unchanged docsepthis paper adapts the recently introduced neural shuffleexchange nse network for 2d inputs this is done by introducing two changes flattening 2d input using a zorder curve and using 4to4 switches rather than 2to2 experiments on synthetic data show that the proposed model can outperform baselines strengths 1 the extension of nse to 2d domains is an interesting research direction 2 the use of the zorder curve to map from 2d to 1d such that locality is approximately preserved is interesting 3 the model performs well for the tasks explored here weaknesses 1 the overall contributions are limited while flattening 2d data using a zorder curve is interesting it is a well known technique for spatial indexing the use of zorder curves is also not entirely unprecedented in the field for example 1 2 also use zorder curves in the context of 2d data and 3 in the context of 3d data 2 the paper states that naive flattening of 2d data is expensive since it results in long sequences that are very slow to process with an nse the paper then proposes a technique which flattens 2d data into an equally long sequence this is very confusing im assuming that reported speed improvements are due to the use of 4to4 rather than 2to2 switches is this correct wouldnt this technique also be applicable to the original nse 3 the experiments are limited consisting exclusively of synthetic data and a single baseline for each task while an experiment involving images cifar10 is presented in the appendix the description is brief and the model performs no better than a feedforward network this is concerning as cv seems like a natural domain for exploration especially given overlapping motivation between this work and recent literature exploring
transformers in the vision domain 4 5 as such while the results are promising they are not enough to convince me that matrixse is likely to be useful for practical problems of interest 4 the presentation of technical and experimental details could be improved 1 im confused about how weights are shared in particular while the figures make it clear that weights are shared among shuffle layers in the same block similarly for inverse shuffle layers im unsure if they are shared between units although i suspect they are not 2 the original nse paper pads the input sequence to that length by placing the sequence at a random position and adding zeros on both ends is a similar technique used in this work if not and weights are not shared between units point 41 im confused about how the model can generalize to larger matrices 3 the paper does not contain information about how hyperparameters were tuned or information about stability of results how robust is the model to different hyperparameter settings are reported metrics averaged over multiple runs 4 it should be clarified in the paper that zorder curves are only approximately locality preserving and contain large discontinuous jumps 5 in section 52 the paper states that graph isomorphism network struggles to solve these tasks however on the hardest task gin outperforms matrixse on trianglefinding on the largest graphs recommendation i recommend rejection while i believe the extension of nse to 2d domains is a worthwhile pursuit i do not believe this paper represents significant progress in this regard minor issues 1 section 51 all tasks except matrix squaring has simple on2 algorithms all tasks except matrix squaring have simple on2 algorithms 2 section 52 a common practise a common practice 3 inconsistent spacing is used around parentheses references 1 zhang jianjin et al zorder recurrent neural networks for video prediction 2019 ieee international conference on multimedia and expo icme ieee 2019 2 kumar jayaraman pradeep et al quadtree convolutional neural networks proceedings of the european conference on computer vision eccv 2018 3 corcoran thomas et al a spatial mapping algorithm with applications in deep learningbased structure classification arxiv preprint arxiv180202532 2018 4 parmar niki et al image transformer arxiv preprint arxiv180205751 2018 5 child rewon et al generating long sequences with sparse transformers arxiv preprint arxiv190410509 2019 docsepsummary the paper proposes a network architecture called matrix shuffleexchange matrixse that can learn many logical reasoning tasks on 2d data and graphs it has complexity on2 log n for 2d input of size n x n which is much smaller than the complexity of naive attention applied to 2d data on4 the proposed architecture is an adaptation of the neural shuffleexchange network architecture freivalds et al 2019 moving from 1d to 2d data this adaptation is done by using a zorder iteration of the 2d input then
creative 2 impressive generalization on larger sizes on those tasks mentioned above the matrixse model generalizes well to 2d arrays that have larger sizes than those in the training set outperforming baselines resnet graph isomorphism network 3 simple model design the generalization from neural shuffleexchange network to matrixse is natural and straightforward 4 the paper is wellwritten and easy to read weaknesses 1 lack of theoretical characterization of the model more concretely what kind of operations can be represented by matrixse the theoretical motivation seems to be from the classic result that benes networks can represent any permutation however its not clear how expressive the proposed model is 2 more realistic tasks do the logical reasoning tasks carry over to more realistic scenarios such as modeling social networks represented as graphs overall i vote for accepting the proposed model is simple can perform logical reasoning tasks on 2d data and generalizes well beyond sizes that are in the training set additional feedback and questions do all the matrices in section 51 have binary values how is accuracy defined in tables 1 and 2 does the output matrix have to match the label exactly or is it accuracy per element in section 54 why is residualse so much slower 9x than matrixse i think it should only be 2x slower because the depth of residualse is twice that of matrixse after rebuttal thank you for clarifying details i maintain my rating showing that the inductive bias from the zorder curve and shuffleexchange allows generalization to larger sizes is very interesting overall i vote for accepting ### Summary:
the reviewers are split two reviewers consider the technical contribution of the paper to be insufficient and raise concerns about comparisons with transformers or using more standard benchmarks for gnn experiments the other considers the experiments convincing and the method worth publishing my own view is that this work is not ready for inclusion in the conference in particular i think this paper would be much stronger with either 1 a more practical task to illustrate where this method might be applied in earnest 2 more analysis and baselines on the synthetic data synthetic data can be enough for a new method if it illuminates the functioning and the benefits and drawbacks in this paper we have synthetic data with little analysis and imo concurring with r5 insufficient baselines for example while a vanilla transformer probably could not do the matrix problems with the matrices encoded naively one might expect transformers with sparse attention to do quite well on eg transpose and 90 degree rotation especially given the training curriculum and proper positional embeddings a convolutional network seems like a strawman i also agree with r5 that standard benchmarks for gnn exist and these might be appropriate or at least there should be some discussion of why they are not 3 some theoretical discussion of what the proposed model can do that other methods fundamentally cannot i do think this is interesting work and encourage the authors to revise and resubmit
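The reviews and meta-review above repeatedly refer to flattening a 2d input with a zorder (Morton) curve so that nearby matrix entries stay close together in the resulting 1d sequence before the shuffle and switch layers are applied. As an illustration only, here is a minimal Python sketch of that flattening step; it is not the authors' implementation, the function names are invented for this example, and it assumes the matrix side length is a power of two.

```python
import numpy as np

def interleave_bits(y: int, x: int, bits: int) -> int:
    # Build a Morton code by interleaving the bits of the row (y) and column (x) indices.
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)      # x bits go to even positions
        code |= ((y >> i) & 1) << (2 * i + 1)  # y bits go to odd positions
    return code

def zorder_flatten(m: np.ndarray) -> np.ndarray:
    # Flatten an n x n matrix (n a power of two) into a 1d array in z-order,
    # so that small 2x2, 4x4, ... blocks end up contiguous in the output.
    n = m.shape[0]
    bits = n.bit_length() - 1  # log2(n)
    flat = np.empty(n * n, dtype=m.dtype)
    for y in range(n):
        for x in range(n):
            flat[interleave_bits(y, x, bits)] = m[y, x]
    return flat

m = np.arange(16).reshape(4, 4)
print(zorder_flatten(m))  # [ 0  1  4  5  2  3  6  7  8  9 12 13 10 11 14 15]
```

The printed order shows the locality property the reviewers discuss: each 2x2 block of the matrix occupies four consecutive positions of the flattened sequence, which is what the quaternary (4-to-4) switch and shuffle layers then operate on.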
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
The paper proposes to use a score based on the Fisher-Rao information metric (the Riemannian metric on the space of probability distributions) for the detection of out-of-distribution samples input to a trained DNN. While the output softmax probabilities are used in the black-box and grey-box scenarios, the learnt features in the intervening DNN layers are additionally used in the white-box scenario. The approach models each sample as providing posterior probabilities: (a) the softmax probability in the label space, and (b) class-conditional PDFs over the corresponding feature spaces for each DNN layer, the latter modeled as multivariate Gaussian distributions with diagonal covariance matrices. Extensive experiments on existing benchmarks are conducted for comparative results against the state of the art, demonstrating promising results.

Strengths / weaknesses:

Clarity: the paper is mostly clear and well written; the writing can be improved in some places (e.g., paragraph 2 in the introduction).

Novelty: while the Fisher-Rao metric has been applied in the context of deep learning (natural gradient, regularization of training, etc.; references are needed here), I am unaware of its use for anomaly / out-of-distribution (OOD) detection. It is a reasonable extension to evaluate, given the utilization of the Mahalanobis distance in the prior art (Hendrycks and Gimpel, ICLR 2017). The novelty meets the bar for a publication without rising to the level of being technically significant.

Correctness: the material is technically correct in the large. I went through the approach, including the math in the paper, and skimmed through the appendix; I am mostly familiar with the topic and the material seems correct, though I did not check the appendix comprehensively.

TC1 (Gaussianity): there are two main parts to the proposed score. The one based on the softmax output seems fine. My main concern is with the other part, which uses a multivariate Gaussian model with diagonal covariance for the feature spaces without first validating the premise with a Gaussianity test. While testing for high-dimensional multivariate Gaussians may pose difficulties, the diagonality assumption should make this feasible. I recommend that evaluation and analysis of this assumption be added to the paper.

TC2 (PMFs and PDFs): while the softmax formulation provides a posterior distribution in the label space, carrying this interpretation over to the feature layers for a single input sample, without further grounding in theory or past literature, makes the approach ad hoc. I would like to see the authors clarify this.

TC3 (max and average): the authors comment that for the softmax output they obtain slightly better results using (6), the average over the classes, rather than (5), the min. (a) Would using the average be tantamount to computing the probability with respect to a mixture distribution, and admit a model more applicable for the single-sample scenario than a PMF/PDF-estimate interpretation? (b) Was this also tried with the features, corresponding to Equations 12 and 13?

Experimental evaluation: a good number of experiments have been conducted and presented in the main paper for the black-box (Table 1) and white-box (Table 2) scenarios. These are further supported by additional experiments in the appendix (Appendix E, Tables 5-11) as well as ablation studies for various hyperparameters important to the proposed approach. This is very good. On the flip side, some parts of the experimental validation can be improved to better support the central claim that OOD scores based on the Fisher-Rao information metric outperform the previous state of the art.

EE1 (difficulty of directly comparing with published results): since the experimental settings seem different from other papers (DNN models used, fine-tuning, dataset, etc.), it is really hard to directly compare tables in the paper to those published in the compared SOTA. I tried to do this both for the tables in the main paper and for those in the appendix; since this goes beyond the due-diligence review process, the authors should explain why this is acceptable. (a) Black-box settings: entries in Table 1 and Table 8 cannot be compared with Table 2 in Hendrycks and Gimpel (ICLR 2017, referred to as "baseline"), with Table 1 or Table 2 in Liu et al. (NeurIPS 2020), or with ODIN (Liang et al., ICLR 2018). (b) White-box settings: I compared some numbers across Tables 10 and 11 with Table 2 in Lee et al. (NIPS 2018); discrepancies exist, sometimes very significant ones, and not all the comparative accuracy improvements hold. I gave up on a more comprehensive cross-checking of the results, given the differences in the setups.

EE2 (SOTA used for comparison): apart from Liu et al. (NeurIPS 2020), all the compared approaches seem 3-4 years old. The authors should respond on whether those results are SOTA.

EE3 (inconclusive evidence for improved performance): performance improvements vis-a-vis the reported SOTA are mixed (ref. Tables 8-11), though I consider the results to be promising. The authors should discuss this in their response.

Reproducibility: I expect the results to be reproducible, since enough details are shared in the paper and the code has been made publicly available at https://github.com/igeood/igeood.

I recommend accepting the paper. It is a good addition to the set of methods on an important topic (OOD), the approach is novel and reasonably principled, and the code has been made publicly available at https://github.com/igeood/igeood, which is good for the community to be able to put the above methods to the test. The validation is comprehensive, though the results are somewhat inconclusive (if promising), and there are some concerns regarding the validation of assumptions and the experimental evaluation. My preliminary assessment is that the submission passes the criteria for acceptance to this venue.
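To make the softmax part of the score discussed above concrete, the following is a minimal sketch of a Fisher-Rao-based confidence score over softmax outputs. It relies on the standard closed form for the Fisher-Rao geodesic distance between two categorical distributions (twice the arccosine of the Bhattacharyya coefficient); the per-class centroids, the min/mean aggregation, and the toy numbers are illustrative assumptions rather than the paper's exact construction.

```python
import numpy as np

def fisher_rao_categorical(p, q, eps=1e-12):
    """Fisher-Rao geodesic distance between two categorical distributions.

    Closed form on the probability simplex: 2 * arccos(sum_k sqrt(p_k * q_k)).
    """
    p, q = np.clip(p, eps, 1.0), np.clip(q, eps, 1.0)
    bc = np.sum(np.sqrt(p * q))          # Bhattacharyya coefficient
    return 2.0 * np.arccos(np.clip(bc, -1.0, 1.0))

def softmax_ood_score(probs, class_centroids, reduce="min"):
    """Distance from a test softmax vector to per-class mean softmax vectors.

    `class_centroids` is assumed to be a (num_classes, num_classes) array of
    mean softmax outputs per training class; a small score looks in-distribution.
    """
    dists = np.array([fisher_rao_categorical(probs, c) for c in class_centroids])
    return dists.min() if reduce == "min" else dists.mean()

centroids = np.array([[0.90, 0.05, 0.05],
                      [0.05, 0.90, 0.05],
                      [0.05, 0.05, 0.90]])
print(softmax_ood_score(np.array([0.85, 0.10, 0.05]), centroids))  # small: confident, typical
print(softmax_ood_score(np.array([0.40, 0.30, 0.30]), centroids))  # larger: more OOD-like
```

Identical distributions give a Bhattacharyya coefficient of 1 and hence distance 0, while near-uniform softmax outputs sit far from every confident class centroid, which is what makes such a score usable for thresholding.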
This paper presents igeood, a new method for detecting OOD samples by using the geodesic Fisher-Rao distance in confidence scoring. It further combines confidence scores from the logit outputs and the layer-wise features of a deep neural network. The method is validated under various testing environments, such as the availability of OOD data or the accessibility of the latent features of the deep network.

The idea of using the Fisher-Rao distance for OOD detection seems novel and interesting, the theoretical basis of the proposed methodology is well described, and the experiments appear to be well designed overall. Empirically, the proposed method outperforms other methods, especially in the white-box setting where OOD validation samples are available.

According to Table S8 (Table 8 in the appendix), the performance differences between the methods appear to be marginal; can you provide statistical significance for them? In addition, according to Table S9, grey-box ODIN performs better than the proposed method in most cases; can you add some discussion on why? Moreover, the current description of the results in the main text is a bit misleading, because these observations are not properly explained.

In the white-box setups, igeood seems to perform best when OOD samples are available for validation (Table S10), but not when only adversarially generated samples are used for validation (Table 2, Table S11). This result does not match the purpose of the study presented in the abstract and introduction (e.g., "igeood applies to any pretrained neural network" and "does not require OOD samples or assumptions on the OOD data").

Recently, many SSL (self-supervised learning)-based OOD detection methods have been developed; examples are SSD [1] and CSI [2], which use the same confidence score as in Lee et al. [3] while utilizing self-supervision (SimCLR). I wonder (1) whether the proposed method can be compared to these self-supervision-based models, and (2) whether the proposed geodesic distance can be applied to the SSL-based approaches.

[1] Sehwag, Vikash, Mung Chiang, and Prateek Mittal. SSD: a unified framework for self-supervised outlier detection. International Conference on Learning Representations, 2020.
[2] Jihoon Tack, Sangwoo Mo, Jongheon Jeong, and Jinwoo Shin. CSI: novelty detection via contrastive learning on distributionally shifted instances. Advances in Neural Information Processing Systems 33, 2020.
[3] Lee, Kimin, et al. A simple unified framework for detecting out-of-distribution samples and adversarial attacks. Advances in Neural Information Processing Systems 31, 2018.

The proposed model is based on the assumption that the layer output follows a Gaussian distribution; is it possible to provide validation or discussion of this assumption?

Regarding Eqs. 5 and 6: the authors note that taking the sum (5) instead of the minimum distance to the class-conditional centroid produces better results. This looks interesting; can you provide empirical results or more analysis on this?

It would also be interesting to see the score distributions, as shown in Figure 1c, for the real datasets used in this study.

In summary, this paper presents a novel idea of using the Fisher-Rao distance for confidence scoring in OOD detection. The main idea is interesting and the paper is well structured in terms of the methodological description and the problem/experimental setups, but the empirical results do not appear to be sufficient to validate the intended purpose of this study.
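Several of the concerns above (the Gaussianity check in TC1, the question about the layer-output assumption and Eqs. 5-6) involve the feature-layer part of the score, where class-conditional Gaussians with diagonal covariance are compared. For reference, the Fisher-Rao distance between univariate Gaussians has a closed form, and a diagonal-covariance version can be assembled per coordinate as a product-manifold distance. How a single test feature vector is mapped to a Gaussian, and whether the paper aggregates coordinates exactly this way, are assumptions of this sketch, not claims about the paper.

```python
import numpy as np

def fisher_rao_gaussian_1d(mu1, sigma1, mu2, sigma2):
    """Closed-form Fisher-Rao distance between N(mu1, sigma1^2) and N(mu2, sigma2^2).

    The univariate Gaussian family with the Fisher metric is, up to a sqrt(2)
    factor, the hyperbolic half-plane, which yields this arccosh expression.
    """
    num = 0.5 * (mu1 - mu2) ** 2 + (sigma1 - sigma2) ** 2
    return np.sqrt(2.0) * np.arccosh(1.0 + num / (2.0 * sigma1 * sigma2))

def fisher_rao_diag_gaussian(mu1, sig1, mu2, sig2):
    """Diagonal-covariance case treated as a product manifold: combine the
    per-coordinate distances in quadrature (one natural choice, assumed here)."""
    per_dim = [fisher_rao_gaussian_1d(m1, s1, m2, s2)
               for m1, s1, m2, s2 in zip(mu1, sig1, mu2, sig2)]
    return float(np.sqrt(np.sum(np.square(per_dim))))

# Distance between two diagonal Gaussians fitted to layer features (toy numbers).
print(fisher_rao_diag_gaussian(np.array([0.0, 1.0]), np.array([1.0, 0.5]),
                               np.array([0.3, 1.2]), np.array([1.1, 0.6])))
```

A useful sanity check: when only the variances differ, the univariate expression reduces to sqrt(2) * |ln(sigma1 / sigma2)|.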
The paper proposes a new group of methods for supervised OOD detection. In particular, the authors propose to use the Fisher-Rao distance between output distributions on the in-distribution data and test samples to detect OOD. The authors additionally propose to use the Fisher-Rao distance in the hidden-layer feature space when possible (white-box setting). The method achieves strong empirical performance, improving upon standard baselines such as ODIN and Mahalanobis, especially in the white-box setting.

Methodology: the authors propose two related methods: (1) measuring distance in the output space, and (2) measuring distance in the space of the hidden-layer activations. (1) is quite intuitive and, as far as I know, novel; to the best of my knowledge, prior work typically considers just the confidence of the classifier and not the full predictive distribution. (2) is quite similar to Mahalanobis [1], the main difference being the distance metric used; however, the authors show significant improvements in performance compared to [1] in Section 4.3, justifying the proposed method.

Results: the empirical results constitute the main strength of this paper; the method outperforms the considered baselines across the board. However, I have two concerns. First, the improvements in the black-box setting appear very minor: in many cases the method does not outperform the baselines, or improves the results only marginally (see, e.g., the AUROC results for ResNet in Table 1); it would potentially be helpful to add error bars to Table 1. Second, the paper claims to set the new state of the art on visual out-of-distribution detection, but the considered baselines are somewhat limited. OOD detection is a very active field, with many papers claiming to improve on the considered baselines (see, e.g., [2]-[6]). From a quick comparison, it seems like the results reported by the authors are competitive with the best methods I could find, but I think the authors need to do a better job of comparing to prior work in order to claim SOTA. On the other hand, I want to highlight that the authors perform a fairly exhaustive experimental evaluation in terms of the out-of-distribution datasets considered for each in-distribution dataset, including both near- and far-OOD.

Writing: the writing is generally clear, but there are several typos and minor inaccuracies.

Comparison to other distance metrics: as far as I understand, it is possible to replace the Fisher-Rao metric with any other distance metric in the method proposed by the authors. I think it would be interesting to see a comparison of the results with a few other standard metrics in the black-box setup (e.g., KL divergence, total variation distance, etc.). If the Fisher-Rao metric provides significantly better results, this experiment would strengthen the paper.

References:
[1] A simple unified framework for detecting out-of-distribution samples and adversarial attacks. Kimin Lee, Kibok Lee, Honglak Lee, Jinwoo Shin.
[2] Generalized ODIN: detecting out-of-distribution image without learning from out-of-distribution data. Yen-Chang Hsu, Yilin Shen, Hongxia Jin, Zsolt Kira.
[3] Detecting out-of-distribution examples with in-distribution examples and Gram matrices. Chandramouli Shama Sastry, Sageev Oore.
[4] Hybrid models for open set recognition. Hongjie Zhang, Ang Li, Jie Guo, Yanwen Guo.
[5] Deep residual flow for out of distribution detection. Ev Zisselman, Aviv Tamar.
[6] A simple fix to Mahalanobis distance for improving near-OOD detection. Jie Ren, Stanislav Fort, Jeremiah Liu, Abhijit Guha Roy, Shreyas Padhy, Balaji Lakshminarayanan.

In summary, I recommend a weak accept for this paper. The empirical results are good and the method is generally novel. I am open to increasing my score if the authors address the concerns I raised in my review.

Update: I acknowledge that I have read the author responses as well as the other reviews. While the authors added further analysis of the proposed method, I remain skeptical of its performance, since the proposed method requires validation OOD data to achieve SOTA performance and the runtime is much larger than the MSP and energy baselines. However, I think the rebuttal clarified much of my concerns, and I therefore raise the score to 5 (weak reject).

The paper proposes an out-of-distribution detection (OOD) metric based on the Fisher-Rao distance, which can be applied to pretrained classifiers. First, the authors derive a Fisher-Rao distance applied to the softmax distribution. They also motivate it with a toy example where the Fisher-Rao distance outperforms the conventional OOD metric (the Mahalanobis distance). Furthermore, they formulate the Fisher-Rao distance-based framework, igeood, for the black-/grey-box setting, where we can only access the logits of the network output, and for the white-box setting, where we can access intermediate feature layers. Finally, the authors compare igeood against conventional OOD metrics on various out-of-distribution and in-distribution data.

Strengths of the paper:
1. The concept of applying the Fisher-Rao distance to out-of-distribution detection seems novel and important to me. Furthermore, the examples on the Gaussian strengthen the motivation for the Fisher-Rao distance's theoretical benefit.
2. The proposed Fisher-Rao distance can be combined with preexisting techniques (input preprocessing, temperature scaling, and feature ensembling) to boost OOD performance.
3. The proposed method shows competitive results in the black-box OOD detection setting.

Weaknesses of the paper:
1. One major issue of the proposed algorithm is that we have to tune the centroid parameter of each class. While the authors note that they optimized for 100 epochs, I wonder how the extra computation time scales compared to the baselines (e.g., the Mahalanobis-distance and energy-based baselines), especially on the CIFAR-100 dataset, where the centroid mean has to be evaluated 100 times.
2. Furthermore, given the computation time of (1), the experimental results are too weak to champion igeood as the OOD method suitable for pretrained classifiers: in the grey-box setting, ODIN outperforms igeood, and in the white-box setting the method is only compared against the Mahalanobis distance.

Comments:
1. As mentioned in the weakness section, reporting the wall-clock time of igeood on the various datasets would help resolve the time-efficiency issue of the method.
2. Furthermore, in the white-box setup, I suggest comparing the method not only against the Mahalanobis distance but also against recently proposed methods (e.g., [1]).
3. In Figure 1c, while it is convincing that the Fisher-Rao distance improves detection against type-2 OOD data, the two histograms are not normalized enough to make comparisons; I suggest matching the area of the in-distribution data frequencies between the two histograms.
4. Regarding "empirically, we show in the appendix (see Section C) that this confidence score does not degrade and sometimes improves the in-distribution test classification accuracy": I found the results and the claim rather redundant; if there is no stark improvement, why should we take a look into it?
5. For Figure 2, I suggest adding the distributions of the baselines for direct comparison.
6. How does the choice of validation dataset impact performance? I am skeptical about the absolute or relative robustness of igeood against the other algorithms: first, an 8% variation in TNR does not seem so robust, since the paper's gain against the baselines is not bigger than 8%; furthermore, I cannot find a major difference against the baselines given the results in E.3.

References:
[1] Detecting out-of-distribution examples with Gram matrices. C. S. Sastry and S. Oore.

The paper proposes a new OOD detection framework, igeood. My biggest concern is that the experimental results are fairly weak given the complex procedure for obtaining the detection scores; therefore, I am leaning towards rejection of the paper.

### Summary:
This paper introduces a novel approach for out-of-distribution detection that generates scores from a trained DNN model by using the Fisher-Rao distance between the feature distributions of a given input sample, at the logit layer and the lower layers of the model, and the corresponding mean feature distributions over the training data. The use of the Fisher-Rao distance is novel in the context of OOD detection, and the empirical evaluations are extensive. The main concerns of the reviewers were the limitations of the Gaussianity assumption used in computing the Fisher-Rao distance, and the use of the sum of the Fisher-Rao distances to the class-conditional distributions of the target classes rather than the minimum distance; these concerns were addressed satisfactorily in a revision. In terms of technical novelty and experimental evaluation, the paper is above the bar for acceptance.
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
This paper casts some light on the complexity of evidence combination, which is little explored in the literature, characterizing some cases in which the complexity, which in general is #P-hard, can dramatically drop to polynomial; this has important consequences for the scalability of Dempster-Shafer theory. As for weaknesses, I cannot see any. This is an extremely interesting contribution which I hope will pave the way for more investigations into the characterization of the complexity of evidence combination. The paper is very well written and technically sound, to the extent that I could check (I did not check the proofs in the appendix).

Strengths: the complexity analysis; the clear writing; the coherence of the approach (a poly-time algorithm where possible, a fixed-parameter tractable algorithm where not). Weaknesses: no experimental study is provided. As far as I understand, the source of complexity in the present study is the number of BPAs to merge, but in the applications cited as motivation (diagnosis, combination of human opinions) this number is generally low; what are the other sources of complexity? An experimental study on real problems, thus with a low number of BPAs to merge, should be easy to conduct, and such a study would make the paper complete. What about a generalization of the result to mass functions containing disjoint evidences (i.e., all the focal elements of a given BPA are disjoint)? To what extent is the number of such basic evidences in each BPA a source of complexity?

This paper makes new contributions on the computational complexity of Dempster's combination rule, identifying situations where tractable methods can be found. Definitions are dense, and results need to be more justified. The results of this paper are proposed in the context of belief functions for representing and reasoning with uncertain information. The authors are interested in one of the fundamental operators of belief functions: Dempster's conjunctive combination rule. More precisely, the paper characterizes situations where Dempster's combination rule can be computed in polynomial time. The authors focus on the combination of simple support functions with a single focal element. The key to the paper can be found in the result of Shafer and Logan (2008), which provides a polynomial algorithm for calculating plausibility functions obtained by combining simple support functions with Dempster's rule; tractability becomes possible if the focal elements follow a hierarchical structure. Starting from this result, the authors search for the conditions under which such hierarchies exist and, therefore, a polynomial algorithm for Dempster's rule can be found. The paper is generally well written; even if some definitions are dense, one can follow the main results. It has more or less significant contributions: a number of complexity results with reductions between problems. A few remarks: the example given on page 1 is not reused in the paper, whereas it was expected to be a running example. The restriction to simple support functions announced on page 2 is not justified; we understand this later, but we expect a little more motivation. In particular, the combination of two simple support functions is not a simple support function, so at this level we expect explanations of why this restrictive framework is considered. I propose to give the definition of the rule of combination with simple support functions; there are interesting properties that may explain the interest of restricting to simple support functions (a brief sketch of the rule for this case appears after this review). In Theorem 4, it is not clear whether the dichotomous restrictions limit the significance and importance of the theorem. Finally, I find that an important reference is missing: P. Smets (1995), "The canonical decomposition of a weighted belief," in Proceedings of the Fourteenth International Joint Conference on Artificial Intelligence, volume 2, pages 1896-1901, Montreal, Canada, August 1995, which concerns the decomposition of a mass function into simple support functions.
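As context for the remark above asking for the definition of the combination rule restricted to simple support functions, here is a minimal sketch of Dempster's rule applied to two simple support functions (each putting its mass on one focal set plus the whole frame). The frame, focal sets, and support weights are illustrative assumptions and are not taken from the paper under review.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for mass functions given as {frozenset: mass} dicts.

    Masses of intersecting focal sets are multiplied; mass falling on the empty
    set (the conflict K) is discarded and the rest is renormalized by 1 - K.
    """
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: combination is undefined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two simple support functions on the frame {x, y, z}.
theta = frozenset({"x", "y", "z"})
m1 = {frozenset({"x", "y"}): 0.8, theta: 0.2}   # support 0.8 for {x, y}
m2 = {frozenset({"y", "z"}): 0.6, theta: 0.4}   # support 0.6 for {y, z}
print(dempster_combine(m1, m2))
# mass 0.48 on {y}, 0.32 on {x, y}, 0.12 on {y, z}, 0.08 on theta (no conflict here)
```

Note that the result is no longer a simple support function, which is exactly the point the remark raises about the restricted framework.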
The paper is really well written and clear; the results are interesting and well presented; the importance of the results in the given domain is well discussed; and most proofs are clearly written and easy to follow. Some proofs, however, are too sketched and would be better presented by expanding them in an appendix.

The authors study the complexity of Dempster's rule, starting from the well-known fact that the problem is #P-complete and exploring various restrictions of the problem for which lower complexity can be achieved. In particular, they focus on the case of evidence represented in hierarchical form, providing both algorithms to check whether a body of evidence conforms to a given hierarchy, or to transform bodies of evidence into a relaxed hierarchy, and algorithms in polynomial or fixed-parameter polynomial time for computing the combination of such bodies of evidence.

The paper is very well written, the exposition is good, and the proofs are clear, even if some proof sketches are almost too sketched; all results seem to be correct when working out the proofs oneself. Maybe the authors could add additional details in an appendix (e.g., for Theorem 5.4 and Proposition 6.4). The results are well motivated, and it is clear that, from a theoretical point of view, this work expands the state of the art of algorithms for Dempster-Shafer theory, at least to my knowledge. Aside from the theoretical point of view, however, I think it would be interesting if the authors could provide some indication of to what degree the proposed algorithms improve over naive application of Dempster's combination rule on some examples or benchmarks. This would be particularly interesting for the fixed-parameter tractable algorithms, since in some cases (e.g., when, as in most practical problems, the size of the parameter is relatively large) such algorithms fail to offer a significant advantage over the more naive solutions, and often come with a nontrivial increase in implementation complexity.
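The review above refers to algorithms that check whether a body of evidence conforms to a hierarchy. One natural reading of "hierarchical form" (an assumption here; the paper's formal definition may differ) is that the pooled focal sets form a laminar family, i.e., every pair of focal sets is either nested or disjoint. A minimal check of that property:

```python
from itertools import combinations

def is_hierarchical(focal_sets):
    """True if every pair of focal sets is nested or disjoint (a laminar family)."""
    sets = [frozenset(s) for s in focal_sets]
    for a, b in combinations(sets, 2):
        nested = a <= b or b <= a
        disjoint = not (a & b)
        if not (nested or disjoint):
            return False
    return True

print(is_hierarchical([{"x"}, {"x", "y"}, {"z"}, {"x", "y", "z"}]))  # True
print(is_hierarchical([{"x", "y"}, {"y", "z"}]))                     # False: partial overlap
```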
### Summary:
Meta review.

Pros:
1. The paper makes nontrivial advances over the current state of the art for hierarchical models.
2. The paper is likely to have high impact within a subfield of AI, or moderate impact across more than one subfield of AI.
3. Key resources (e.g., proofs, code, data) are available, and key details (e.g., proof sketches, experimental setup) are described comprehensively enough for competent researchers to confidently and easily reproduce the main results.
4. The paper is well organized and clearly written.

Cons:
1. Some proofs are too sketched and would be better presented by expanding them in an appendix.
2. Some definitions are dense, and results need to be more justified.
1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 43603, 690, 1708, 327, 253, 10454, 273, 1941, 5019, 534, 310, 1652, 14859, 275, 253, 6239, 39330, 690, 2219, 275, 534, 253, 10454, 534, 275, 2087, 310, 815, 472, 476, 16821, 5926, 281, 14189, 326, 556, 1774, 9099, 323, 253, 9171, 1430, 273, 1471, 81, 13297, 3227, 1592, 3762, 50276, 32824, 923, 667, 436, 310, 271, 6685, 4722, 7680, 534, 891, 3524, 588, 29238, 253, 4088, 281, 625, 14006, 715, 253, 14846, 273, 10454, 273, 1941, 5019, 253, 2929, 310, 1077, 973, 3542, 285, 22335, 3590, 281, 253, 6070, 326, 891, 812, 2451, 891, 42126, 2451, 253, 27947, 275, 253, 30762, 5474, 33032, 253, 10454, 1783, 50276, 783, 2590, 4028, 50276, 783, 50276, 1940, 11724, 273, 253, 2746, 877, 1767, 553, 5933, 604, 1896, 4229, 4764, 10649, 494, 5933, 604, 417, 50275, 2369, 5661, 1263, 310, 2530, 347, 2080, 347, 891, 2096, 253, 2603, 273, 10454, 310, 275, 253, 1246, 1263, 253, 1180, 273, 20633, 284, 50276, 936, 17310, 50276, 2858, 275, 2898, 11106, 347, 16038, 6120, 5019, 273, 1966, 4743, 436, 1180, 310, 3839, 1698, 752, 403, 253, 643, 4973, 273, 10454, 50275, 266, 5661, 1263, 327, 1524, 3237, 3021, 342, 247, 1698, 1180, 273, 270, 4904, 281, 17310, 943, 320, 3477, 281, 2589, 824, 247, 1263, 651, 1056, 253, 2929, 3426, 50276, 5371, 670, 247, 26647, 273, 253, 906, 281, 2280, 3470, 4508, 28465, 20456, 2979, 253, 512, 253, 18560, 3603, 273, 247, 1677, 270, 4904, 403, 28465, 50276, 936, 752, 6070, 253, 1180, 273, 824, 5044, 20456, 2979, 50276, 249, 1016, 270, 4904, 310, 247, 2603, 273, 10454, 50275, 7152, 33032, 1826, 9021, 327, 253, 15180, 10454, 273, 1471, 81, 2971, 5019, 4086, 342, 8137, 273, 4112, 835, 10649, 494, 3082, 476, 320, 3636, 14308, 403, 14086, 1543, 878, 281, 320, 625, 17285, 50276, 16680, 273, 436, 2929, 403, 4081, 275, 253, 3634, 273, 9927, 3470, 323, 9999, 285, 14720, 342, 8767, 1491, 50276, 783, 4477, 403, 6110, 275, 581, 273, 253, 7936, 5572, 275, 9927, 3470, 1471, 1033, 1336, 7862, 28816, 5019, 4086, 50276, 3062, 10534, 253, 2929, 45589, 9534, 835, 253, 1471, 81, 2971, 5019, 4086, 476, 320, 2218, 275, 14189, 673, 50276, 783, 4477, 2770, 327, 253, 5019, 273, 2969, 1329, 3470, 342, 247, 2014, 18560, 3284, 50276, 783, 2234, 281, 253, 2929, 476, 320, 1119, 275, 253, 906, 273, 48183, 1592, 285, 2412, 266, 4695, 835, 247, 14189, 5933, 323, 18899, 18662, 2322, 3470, 2797, 407, 5019, 407, 1471, 81, 13297, 4086, 273, 2969, 1329, 3470, 310, 2530, 253, 10649, 1430, 3395, 1896, 604, 253, 18560, 3603, 956, 247, 24498, 2605, 50276, 45033, 432, 436, 906, 253, 4477, 3186, 762, 534, 2515, 824, 20258, 447, 2226, 285, 3103, 247, 14189, 5933, 323, 1471, 81, 13297, 4086, 476, 320, 1119, 50276, 783, 2929, 310, 3839, 973, 3542, 1014, 604, 690, 14308, 403, 14086, 581, 476, 956, 253, 2022, 1543, 273, 253, 2929, 352, 556, 625, 390, 1679, 1534, 9021, 247, 1180, 273, 10454, 1543, 342, 23082, 273, 3237, 50276, 66, 1643, 16157, 50276, 783, 1650, 1677, 327, 3239, 337, 310, 417, 294, 3197, 275, 253, 2929, 5727, 352, 369, 3264, 281, 320, 247, 3515, 1650, 273, 253, 2929, 50275, 783, 12400, 281, 253, 2969, 1329, 1159, 6138, 327, 3239, 374, 310, 417, 17285, 359, 2096, 436, 1996, 359, 1902, 247, 1652, 625, 16038, 275, 1798, 253, 5019, 273, 767, 2969, 1329, 3470, 310, 417, 247, 2969, 1329, 1159, 594, 387, 436, 1268, 359, 1902, 22909, 2139, 1908, 436, 29190, 7792, 50274, 74, 12661, 281, 1918, 253, 5426, 273, 253, 4086, 273, 5019, 342, 2969, 1329, 3470, 627, 403, 4722, 3607, 326, 778, 
5513, 253, 1600, 273, 34617, 281, 2969, 1329, 3470, 50274, 249, 10012, 577, 352, 310, 417, 2590, 1880, 253, 19821, 35756, 528, 13133, 2701, 253, 8453, 285, 6349, 273, 253, 10012, 50275, 71, 3341, 891, 1089, 326, 271, 1774, 3806, 310, 5816, 268, 924, 1507, 8878, 253, 15516, 14717, 273, 247, 17375, 9927, 275, 10061, 273, 253, 1740, 16565, 5213, 6036, 8059, 327, 13345, 9260, 4644, 374, 7223, 43843, 746, 520, 24325, 6549, 476, 2960, 14688, 461, 8878, 534, 7350, 253, 14717, 273, 2280, 1159, 715, 2969, 1329, 1159, 5474, 33032, 253, 2929, 310, 1663, 973, 3542, 285, 2590, 50276, 783, 1543, 403, 4722, 285, 973, 3559, 50276, 783, 6349, 273, 253, 1543, 275, 253, 1677, 5028, 310, 973, 5469, 50276, 2252, 27947, 403, 4518, 3542, 285, 3477, 281, 956, 50276, 8826, 27947, 403, 1512, 30547, 2147, 285, 651, 320, 1805, 3559, 407, 16122, 731, 275, 271, 30762, 253, 4477, 1263, 253, 10454, 273, 1471, 81, 13297, 4086, 4983, 432, 253, 973, 4304, 958, 326, 253, 1895, 310, 268, 11984, 285, 18216, 2710, 13133, 281, 253, 1895, 323, 534, 2406, 10454, 476, 320, 6786, 275, 1798, 597, 2770, 327, 253, 1083, 273, 1941, 6607, 275, 288, 16749, 1116, 474, 830, 5277, 1097, 5933, 281, 2451, 1880, 247, 2133, 273, 1941, 10138, 84, 281, 247, 1677, 19868, 390, 281, 4979, 8248, 273, 1941, 275, 247, 19595, 19868, 285, 2085, 11333, 275, 14189, 390, 4229, 19484, 14189, 673, 323, 12672, 253, 5019, 273, 824, 8248, 273, 1941, 50276, 783, 2929, 310, 1077, 973, 3542, 253, 47284, 310, 1175, 285, 253, 27947, 403, 2590, 1014, 604, 690, 4737, 46159, 403, 2761, 1512, 30547, 2147, 512, 1543, 1646, 281, 320, 3451, 407, 2444, 562, 253, 4737, 407, 327, 1286, 2299, 5046, 253, 4477, 812, 823, 3081, 4278, 275, 271, 30762, 24088, 323, 10012, 8255, 285, 13989, 6705, 50276, 783, 1543, 403, 973, 24013, 8550, 285, 352, 310, 2590, 326, 432, 247, 10527, 1127, 273, 1859, 436, 789, 35205, 327, 253, 1375, 23037, 14387, 273, 11333, 323, 1471, 81, 13297, 3227, 1592, 3762, 387, 1878, 281, 619, 3640, 9255, 432, 253, 10527, 1127, 273, 1859, 2299, 891, 1158, 352, 651, 320, 4722, 604, 253, 4477, 812, 2085, 690, 25488, 275, 2743, 281, 752, 4248, 253, 4081, 11333, 1581, 281, 3157, 689, 27785, 2898, 273, 1471, 81, 13297, 5019, 4086, 327, 690, 6667, 390, 49602, 436, 651, 320, 3782, 4722, 323, 253, 4229, 19484, 10649, 494, 11333, 347, 275, 690, 2219, 24088, 672, 275, 954, 8542, 3237, 253, 1979, 273, 253, 4764, 310, 4942, 1781, 824, 11333, 1891, 281, 3959, 247, 1534, 5750, 2429, 342, 253, 625, 27785, 5482, 285, 2223, 342, 247, 37825, 2572, 275, 7092, 10454, 2490, 187, 4118, 18435, 27, 13518, 2278, 5847, 337, 253, 2929, 2789, 37825, 16424, 689, 253, 1655, 1375, 23037, 14387, 323, 24498, 3210, 374, 253, 2929, 310, 2779, 281, 452, 1029, 3486, 1561, 247, 749, 3423, 273, 23105, 390, 10290, 3486, 2439, 625, 685, 581, 749, 3423, 273, 23105, 495, 2234, 5300, 24088, 27947, 2127, 941, 403, 2130, 285, 2234, 4278, 24088, 4737, 46159, 5661, 9978, 403, 9483, 1242, 2529, 323, 20566, 8607, 281, 13224, 314, 285, 4354, 18302, 253, 2022, 1543, 577, 253, 2929, 310, 973, 34092, 285, 4518, 3542, 50276, 5040, 337, 690, 27947, 403, 1512, 30547, 2147, 285, 651, 320, 1805, 3559, 407, 16122, 731, 275, 271, 30762, 374, 690, 14308, 403, 14086, 1543, 878, 281, 320, 625, 17285, 209 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

This paper suggests a source of slowness when training a two-layer neural network: an improperly trained output-layer classifier may hamper learning of the hidden-layer features. The authors call this "inverse internal covariate shift", as opposed to the usual one, where the feature distribution shifts and trips the classifier. They identify hard samples (those with large loss) as the impediment. They then propose a curriculum in which such hard samples are identified at early epochs, their loss is attenuated, and it is replaced with a requirement that their features be close to neighboring (in feature space) samples that are similarly classified but with a more comfortable margin, and are thus easy. The authors claim that this allows those samples to contribute through their features at first, without slowing training down, and then to contribute fully in later epochs. Some experiments are offered as evidence that this indeed helps speed-up.

The paper is extremely unclear and was hard to read. The narrative is too casual, a lot of hand-waving is made, and the notation is very informal and inconsistent; I had to second-guess multiple times before deciphering what could possibly have been said. Based on this alone, I do not deem this work ready for sharing. Furthermore, there are some general issues with the concepts. Here are some specific remarks:

- The intuition of the inverse internal covariate shift is perhaps the main merit of the paper, but I am not sure this was not mostly appreciated already.
- The paper offers some experimental poking and probing to find the source of the issue, but that part of the paper (Section 3) is disconnected from what follows, mainly because hardness there is not a single-point notion but rather that of regions of space with a heterogeneous presence of classes. This is quite intuitive; in fact, later in Section 4, hard simply means high loss. This is not quite the same, since the former notion rather means being near the decision boundary, which is not captured by just having high loss. Also, the loss is not specified.
- Some issues with Section 3: the notion of task needs a more formal definition, and then subtasks, unions of tasks, priors on tasks, etc. — it is all too vague. The term "non-computable" has a very specific meaning; best to avoid it. Figure 2 is very badly explained: I believe the green curve is the number of classes represented by one element or more, while the red curve is the number of classes represented by 5 elements or more, but I had to figure it out on my own. The whole paragraph preceding Figure 3 is hard to follow; I can sort of make out what is going on, especially with the hindsight of Section 4, since it is basically a variant of the proposed schedule (easy to hard, making sure all clusters, as a proxy for classes, are represented, without the feature loss), but it needs rewriting.
- It is important to emphasize that the notions of easy and hard can change along the training, because they are relative to what the weights are: features of some samples may not be very separable at the hidden layer at some stage, but may become very separable later. The suggested algorithm does this re-evaluation, but this is not made clear early on.
- In Section 4, the sentence where s_t(x) is mentioned is unclear; I assume "surpass" means achieving a better loss. Also, later m_t (a margin) is used when I think what is meant is s_t (a set). The whole notation (e.g., top-k indexing that is not subscripted, non-math-mode math) is bad.
- If l_t is indeed a loss and not a performance (as it is sometimes referred to, as in "minus loss"), then I assume larger losses mean that the weight on the feature loss in Equation 3 should be larger, so I think a minus sign is missing in the exponent of Equation 2, and also in the algorithm.
- I am not sure the experiments actually show a speed-up in the sense of what the authors started out motivating. A speed-up, for me, would look like the training progress curves being compressed: everything happens sooner in terms of epochs. Instead, what we have is basically the same-shape curve with a slight boost in performance (Figure 4). It is totally disingenuous to say this is a great boost in speed (end of Section 5.2) by pointing out that it took 30 epochs for the non-curriculum version to reach its performance, when within 4 epochs — just like the curriculum version — it was basically at its final performance.
- So the real conclusion here is that this curriculum may not have sped up the training in the way we expect at all. However, the gradual introduction of badly classified samples in later epochs, while essentially replacing their features with those of similarly classified samples in earlier epochs, has somehow regularized the training. The authors do not discuss this at all, and I think they draw the wrong conclusion from the results.

docsep

This paper describes an approach for automated curriculum learning in a deep-learning classification setup. The main idea is to weigh data points according to the current value of the loss on those data points. A naive approach would prevent learning from data points that are hard to classify given the parameters of the current model, so the authors propose an additional loss term for these hard data points, which encourages their hidden representation to be closer to the representations of points that are close in the hidden space and yet easier to classify, in the sense that the loss of easy samples is lower, by some threshold value, than the loss of hard samples. This last part is implemented by caching hidden representations and classification-loss values during training and fetching nearest neighbours in the feature space whenever a hard data point is encountered. The final loss takes the form of a linear combination of the classification loss and the representation loss (a rough sketch of this kind of objective is given below). The idea is interesting in the sense that it tries to use information about how difficult the classification of a given data point is in order to improve learning. The proposed representation loss can lead to the formation of tight clusters of similar data points in the feature space and can make classification easier. It is related to student–teacher networks, where a student is trained to imitate the teacher by generating similar feature representations.
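For concreteness, here is a minimal sketch of the kind of weighted objective the review describes: a per-sample weight derived from the current loss, combined with a feature-matching term toward a cached easier neighbour. This is an editor's illustration, not the paper's implementation — the function and argument names, the exponential weighting form, and the PyTorch-style API are all assumptions.

```python
# Illustrative sketch only (editor's reconstruction, not the paper's code).
import torch
import torch.nn.functional as F

def curriculum_loss(logits, features, targets, easy_features, temperature=1.0):
    """Combine classification loss with a feature-matching term for hard samples.

    logits:        model outputs for the batch, shape (B, C)
    features:      hidden representations of the batch, shape (B, D)
    targets:       ground-truth labels, shape (B,)
    easy_features: cached representations of nearby "easy" samples, shape (B, D)
    """
    # Per-sample classification loss, no reduction yet.
    ce = F.cross_entropy(logits, targets, reduction="none")        # (B,)

    # Weight in (0, 1]: hard samples (large loss) get a small classification weight.
    # Note the minus sign in the exponent -- the review argues such a sign is
    # missing from the paper's Equation 2.
    w = torch.exp(-ce.detach() / temperature)                      # (B,)

    # Representation loss: pull each sample's features toward its cached easy neighbour.
    rep = ((features - easy_features.detach()) ** 2).mean(dim=1)   # (B,)

    # Linear combination: attenuated classification loss plus feature-matching term.
    return (w * ce + (1.0 - w) * rep).mean()
```

Under this form the weight stays in (0, 1], so neither term is negated — which is exactly the range issue the review raises next about Equation 2.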
The authors justify the method by introducing the notion of "inverse internal covariate shift"; however, it is not defined formally, nor is it supported empirically, and it is based on the often-criticized [1] notion of internal covariate shift. For this reason it is hard to accept the presented argumentation in its current state. Moreover, there seems to be a mistake in Equation 2 in Section 4.2. The equation defines how the loss weighting for a given data point is computed; the authors note that it converges to one with increasing training iterations, but for correctness it should lie in [0, 1]. If it is greater than or equal to 1, one of the losses in Equation 3 is negated and is therefore maximised instead of minimised, which can lead to unexpected behaviour; the current parameterization allows it to be in [0, infinity).

The experimental evaluation consists of a quantitative comparison of random sampling (i.e., usual SGD) and the proposed approach when training a classification model on MNIST, CIFAR-10, and CIFAR-100. The proposed approach outperforms random sampling. This is encouraging, but the method should be compared to the state of the art in curriculum learning in order to gauge how useful the approach is.

The paper is poorly written, with many grammatical mistakes (a missing "s" at the end of verbs used in the singular third person, in many places) and spelling mistakes (e.g., 326: "tough" instead of "through"). I think some descriptions are unclear (e.g., 422), while some parts of the paper seem irrelevant to the problem at hand (Section 3.1 describes training on a single minibatch for multiple iterations as if it were a separate task, and motivates random sampling, which is just SGD).

To summarize, the paper presents a very interesting idea. In its current state it is hard to read; it also contains a number of unsupported claims and can be misleading. It could also benefit from a more extensive evaluation. With this in mind, I suggest rejecting this paper.

[1] Rahimi, A., 2017. Test of Time award talk, NIPS.

docsep

This paper proposes a curriculum that encourages training on easy examples first and postpones training on hard examples. However, contrary to common ideas, they propose to keep hard examples contributing to the loss, only forcing them to have internal representations similar to a nearby easy example. The proposed objective is hence biased at the beginning, but they dampen the bias over time so that it converges to the true objective at the end.

Positives:
- There is not much work considering each example as an individual subtask.
- The observation that an under-fitted classifier can destroy a good feature extractor is good.

Negatives:
- The intro says the update rule of gradient descent "assumes the top layer f2 to be the right classifier". This seems like a fundamental misunderstanding of gradient descent and the chain rule: the term d(output)/d(f1) takes the error in f2 into account.
- The caption of Figure 2 says they "cannot separate from its neighbors". If the loss of all examples in a cluster is high, all are being misclassified; a classifier then might have an easy job fixing them if all their labels are the same, or a difficult job if their labels are random. The second scenario is unlikely if, based on the claim of this figure, the entropy has decreased during training. In short, the conclusion made in Figure 2 does not necessarily hold given that figure.
- This method is supposed to speed up training, not necessarily improve the final generalization performance of the model, yet the figures show the opposite outcome. It is not clear why; the improvement might be due to not tuning the hyperparameters of the baselines.
- Figure 3 does not necessarily support the conclusion: the fluctuations might be caused by any curriculum that forces a fixed ordering across training epochs. Often, on MNIST, the ordering of data according to the loss does not change significantly throughout training.

### Summary:
This paper attempts to address a problem the authors dub "inverse covariate shift", where an improperly trained output layer can hamper learning. The idea is to use a form of curriculum learning. The reviewers found that the notion of inverse covariate shift was not formally or empirically well defined; furthermore, the baselines used were too weak. The authors should consider comparing against state-of-the-art curriculum learning methods.
[input_ids, attention_mask, labels — raw token-ID arrays for the example above omitted: input_ids and labels are the identical token sequence encoding the review/summary text, and attention_mask is a run of ones.]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

This paper proposes a cross-image context memory (CICM) for learning and using cross-image context to recover corrupted regions. It tries to provide richer context that benefits the inpainting task. The experimental results demonstrate the effectiveness and generalization of CICM.

Strengths:
1. The idea of using external memory is reasonable.
2. The paper is clearly written.

Weaknesses:
1. The method is a little common.
2. The advantages of external memory need more discussion.
3. The framework figure (Figure 2) is a little hard to follow; (c) and (d) provide the details, but they are not necessary.
4. The aim of maximizing/minimizing the inter-/intra-set similarities between the cross-image features is not clear. Why is it useful for inpainting?

Questions:
1. Using external memory can improve the performance, but it also brings a lot of computational cost; please provide some analysis.

docsep

The paper proposes a way to utilize external information for inpainting. To do this, the proposed method maintains a database of clustered features collected from the dataset; then, for each region's encoded feature, a matching cluster is identified and the features of that cluster are added to the encoded feature in a soft, differentiable way (an illustrative sketch of this lookup-and-augment step is given after this review). The proposed method can be an independent add-on to existing inpainting methods, and it consistently improves performance on quantitative metrics that measure similarity with the ground truth, such as PSNR, SSIM, L1, or LPIPS.

Strengths:
- I find it interesting to utilize external features for better image inpainting.
- The ablations are thoughtful, including single-image vs. cross-image context and different ways to create the bank of features.
- The paper achieves consistently better results than the baseline methods; in particular, it can be plugged into multiple existing methods and improves all of them.

Weaknesses:
- The exposition is a bit difficult to follow. In general, the paper focuses on how the method is formulated rather than why. Regarding this I have a few questions; please see the questions section.
- All evaluation metrics measure how much the output matches the ground truth, but that may not directly correlate with realism. Since image inpainting is a multimodal problem with diverse possible outputs, minimizing the L1 loss pushes the output toward the median of all possible values; in the end, the L1 score favors outputs that are smooth and of low saturation. For example, a pix2pix model trained only on the L1 objective (httpsphillipigithubiopix2piximagesindexfacades2lossvariationshtml) looks unrealistic even though it achieves a good L1 loss. To evaluate realism separately, metrics like FID are used (CoModGAN, Zhao et al., ICLR 2021).

The authors do address the limitations.
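To make the lookup-and-augment step described above concrete, here is a minimal editor's sketch of one way such a cross-image memory could work: region features are matched against an external bank of cluster prototypes and softly blended with the retrieved features. The class name, the cosine-similarity matching, the softmax blending, and the additive fusion are all assumptions for illustration — this is not the paper's actual CICM implementation.

```python
# Editor's illustration of a cross-image feature memory (hypothetical, not the paper's code).
import torch
import torch.nn.functional as F

class CrossImageMemory(torch.nn.Module):
    def __init__(self, num_clusters=512, dim=256):
        super().__init__()
        # One prototype feature per cluster, maintained across images during training.
        self.bank = torch.nn.Parameter(torch.randn(num_clusters, dim))

    def forward(self, region_feats):
        """Augment region features with the best-matching cluster features.

        region_feats: (N, D) encoded features of image regions.
        returns:      (N, D) features blended with retrieved cross-image context.
        """
        q = F.normalize(region_feats, dim=1)    # (N, D)
        k = F.normalize(self.bank, dim=1)       # (K, D)
        sim = q @ k.t()                         # (N, K) cosine similarities
        attn = sim.softmax(dim=1)               # soft, differentiable cluster assignment
        retrieved = attn @ self.bank            # (N, D) weighted cluster features
        return region_feats + retrieved         # simple additive augmentation
```

A soft assignment like this keeps the retrieval differentiable, which matches the review's description of the cluster features being combined with the encoded feature "in a soft, differentiable way".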
the other hand the visual quality improvement is also not obvious enough even with the proposed cicm the inpainting results are not as attractive as lama and repaint the writing needs to be improved there are many sentences that are not meaningful but very complicated eg lines 36 to 39 how is the external memory bankbased method compared to gan priorbased methods such as lama is it possible to visualize the features learned in cicm it is interesting to see how the cicm is learned during the training suggestions missing reference in figure 4 na docsepthis paper proposes an image inpainting algorithm that learns visual context features across different images and saves them in an external memory cicm these features are used to augment regional features of the corrupted input image which may result in better completion quality than relying on features inside the single image the proposed approach outperforms existing models on the public datasets strengths the proposed approach based on crossimage context memory is novel it saves higher level richer visual context thus is unlike previous memory based methods such as tmad 42 and srm 43 the proposed approach is highly effective outperforming recent existing works tables 4 and 5 the proposed approach is flexible and general capable of extending several existing frameworks with consistent enhancements table 3 ablation studies on internal component of the proposed approach have been thoroughly made thus proved their importance tables 1 and 2 several extensions and variants of the proposed approach have been explored in the supplementary materials all showing meaningful results weaknesses it is unclear how the cicm is initialized at the beginning of the training it is unclear which architecture is used as their default backbone network the use of an external memory bank increases the ram usage parameters and flops quite a bit depending on the scale of cicm and device property the searching processes may become a bottleneck of the inference speed comments after reading the rebuttal i believe that the proposed work describes a very interesting algorithm based on cicm yet unseen from existing works as addressed by the authors the cicm is fast to search into and effective to produce higher quality generation by utilizing compact crossimage information unavailable within the input image the quality gains are quite clear without much sacrificing the efficiency my final recommendation is still to strongly support to accept the paper yes they addressed the limitations in sections 3 and 4 of the supplementary materials which include failure cases memory increase cross model and cross dataset scenarios and societal impacts these discussions seem to have been thoroughly made docsepthis paper proposes the crossimage context memory cicm for learning and using the crossimage context to recover corrupted images cicm consists of multiple sets of crossimage features learned from the image regions with different visual patterns the regional features are learned across different images thus providing richer context that benefits the inpainting task the experimental results demonstrate the effectiveness and generalization of cicm which achieves stateoftheart performances on various datasets for single image inpainting strengths 1 the utilization of the crossimage context to assist in image inpainting is reasonable and the proposed crossimage context memory is somewhat novel and can also be generalized to the existing inpainting models 2 the experiments are sufficient 
and the internal study is nice for showing the effectiveness of the cicm 3 the presentation is clear and the references are also adequate weaknesses 1 besides the number of feature sets and the size of each set the resolution of the regional features and the number of layers adopting the cicm are also vital settings that affect the image inpainting performance but there seems to be no explanation or analysis of them minor 1 in line 74 the symbols h and w are used to denote the height and width of both the image and the feature but they are actually different 2 in line 117 what is the value of the momentum factor 3 in line 168 four groups should be three groups na ### Summary:
the paper discusses how to use external information for inpainting reviewers appreciated the idea but raised concerns regarding limited novelty the use of the proposed method for inpainting baselines being evaluated incorrectly and missing ablations the rebuttal was able to address most of the concerns and reviewers remained positive the ac concurs and doesnt find reasons to overturn a unanimous majority recommendation
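To make the mechanism these reviews describe a bit more concrete, below is a minimal sketch of an external cross-image feature memory with soft, differentiable matching. It only illustrates the general idea of augmenting per-region features with retrieved cross-image context; the module name, shapes, and the momentum update are hypothetical assumptions and not the authors' actual CICM implementation.

```python
# Hypothetical sketch (not the paper's CICM code): soft retrieval from an
# external feature memory to augment per-region features of a corrupted image.
import torch
import torch.nn.functional as F

class ExternalFeatureMemory(torch.nn.Module):
    def __init__(self, num_entries: int = 512, dim: int = 256, momentum: float = 0.999):
        super().__init__()
        # memory bank of cross-image feature prototypes, updated outside backprop
        self.register_buffer("bank", F.normalize(torch.randn(num_entries, dim), dim=1))
        self.momentum = momentum

    def forward(self, region_feats: torch.Tensor) -> torch.Tensor:
        # region_feats: (B, N, D) features of image regions (e.g. patches)
        q = F.normalize(region_feats, dim=-1)
        sim = q @ self.bank.t()          # (B, N, M) similarity to each memory entry
        attn = sim.softmax(dim=-1)       # soft, differentiable matching
        retrieved = attn @ self.bank     # (B, N, D) retrieved cross-image context
        return region_feats + retrieved  # augment the local features

    @torch.no_grad()
    def update(self, region_feats: torch.Tensor):
        # toy momentum update: pull each entry toward its best-matching region feature
        q = F.normalize(region_feats.reshape(-1, region_feats.shape[-1]), dim=-1)
        idx = (q @ self.bank.t()).argmax(dim=0)   # best region index per memory entry
        self.bank.mul_(self.momentum).add_(q[idx], alpha=1 - self.momentum)
        self.bank.copy_(F.normalize(self.bank, dim=1))
```

In a design like this, the momentum factor the reviewer asks about (line 117 of the paper) would control how quickly bank entries drift toward newly seen region features, and the size and number of feature sets would trade retrieval quality against RAM, parameters, and search cost, which is exactly the efficiency concern raised above.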
Below is a given review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper presents a quantizationaware scaling mechanism to perform ondevice training in extremely limited memory and bandwidth settings the system leverages sparse updates through the tiny training engine by pruning the backward computation graph the experiments on visual wakeword demonstrate the efficacy of the proposed system this is a very wellwritten paper with extensive descriptions and many practical tricks to enable ondevice training namely qas sparse tensor update dce and inplace update the demo shows a functioning working system on a real microcontroller the work presented in the paper contains significant effort the individual tricks themselves are not new eg for the idea of qas i think the authors should cite lsq esser et al 2019 on the whole enabling all the parts to work together systematically is a nontrivial effort the authors seem to have neglected hfp8 sun et al 2020 and their follow up 4bit ultra low precision work for comparison i dont see any statement about opensourcing the implementation without such a promise i feel this work would be extremely difficult to reproduce this is a glaring issue within the quantization community so i would really like to see the code opensourced some day docsepthis paper presents a systemalgorithm codesign approach towards conducting transfer learning training on tiny microcontrollerbased systems with less than 256kb memory the ideas focus on 1 improving the optimization characteristics of quantizationaware training 2 gradient sparsification 3 new compiletime optimizations with the tiny training engine results are shown on several tinyml applications the paper has several strengths 1 training ondevice is much more difficult than inference ondevice hence doing simple transferlearning kinds of tasks on tiny devices is interesting 2 the paper makes significant contributions towards the optimization of quantized networks as well as trainingsupport compilers for tiny devices 3 the results demonstrate significant memory savings compared to standard deep learning frameworks tensorflowpytorch despite the strengths the paper does have several significant weaknesses 1 one of the main motivations that the authors use to disregard several existing trainingtime adaptation techniques eg the ones that rely on updating only the normalization layers reference 25 in the paper is that the normalization layers are not present once the model is deployed on tiny devices this happens because deployment tools like tflite etc fuse the batchnorm layers into convolution weights however this fusing process is extremely cheap and the compiler needs to do it only once for inference can we not keep the norm layers for ondevice training purposes and fuse them only when running inference would training only the norm layers result in significant savings compared to the proposed approach this is a major class of upcoming cheap trainingtime adaptation techniques and solid evidence must be presented to show that the proposed approach indeed outperforms this new class of models the baselines considered for comparison update last k layers update last k biases update only the classifier are not strong if techniques that update only the norm layers are effective this may also reduce the motivation behind the proposed quantization scaling method 2 the improvements to the optimization problem for quantizationaware training seem general however the authors have not provided any evidence that this would result in
general improvements in quantizationaware training and not just for tinyml applications specifically would the proposed scaling method reduce the gap between fp32 and int8 models for nontransfer learning scenarios this kind of experiment can clarify whether the proposed technique is limited to transfer learning only or whether it applies to general quantizationaware training if it only works for transfer learning and not in general why some discussion around the specific conditions under which the proposed method works would strengthen the paper 3 there are many existing gradient sparsification techniques but the authors have not presented any comparison against them for instance this paper and the many references it cites are relevant https://arxiv.org/pdf/1712.01887.pdf some newer papers probably exist in this space since the above paper is from 2017 i would recommend the authors do a thorough comparison against sota gradient sparsification techniques 4 will the code for the tiny training engine be open sourced overall i like the idea in this paper and the fact that this is the first paper that trains models ondevice for realistic applications under 256kb this is a significant contribution however i would like to make sure that the paper is not missing significant baselines and comparisons i can increase the rating if the above weaknesses are adequately addressed update after rebuttal i have read the other reviews and the author response the authors have provided good clarifications and insightful new experiments during the rebuttal they have satisfactorily answered my questions as a result i have increased the rating to 6 weak accept if the paper is accepted i would like the authors to just show a simple comparison between standard quantizationaware training qat eg where bns are still present and are fused after qat is complete and qas on a real quantized graph eg the one without bn while i can see that on a real quantized graph qas has benefits for nontransfer learning scenarios based on new results it would be useful to see the gap between regular qat and qas on real quantized graphs indeed this can help motivate future work that may improve upon qas to close the gap between qat and quantization algorithms that directly work on real quantized graphs mostly yes other things like point 2 above see weaknesses can help further docsepthis paper proposes a novel and efficient framework for ondevice training with 256kb memory in practice this paper introduces the quantizationaware scaling qas technique to stabilize quantized training with mixedbit precision and presents the sparse update technique to save the memory footprint besides the tiny training engine tte is designed to implement the proposed methods in an mcu system strengths 1 this paper is wellwritten and easy to understand 2 this paper proposes a complete hardwaresoftware codesign scheme for training on mcu devices 3 the experiments and analysis are sufficient to support the proposed methods the experimental setup is detailed weaknesses 1 even with much optimization the inference speed is very slow on the mcu device is training really necessary and suitable yes docsepthis paper considers a timely topic especially with the explosion of tiny devices with limited computational resources it is a trend to deploy lifelong learning tasks directly on the devices and realize user customization the authors propose an algorithmsystem codesign to fit this rigorous setting even with only 256kb the model adaptation methods for gradient
calibration and sparse update are straightforward but experimentally work well this framework matches the practical demands in an edge environment and thus is a meaningful solution for machine learning analysis on iot and mcus the experiments take commodity ondevice learning settings and present promising performance in terms of model size memory cost and accuracy generally this is an interesting paper with a clear methodology design and performance evaluation however i have some concerns about the weaknesses as follows 1 the effectiveness of the proposed model compression and learning strategy is highly related to the processing pattern of the underlying hardware for example not all the commodity processing chips can support the specific int8 operations or efficiently handle these operations the true acceleration of sparse computing relies on specific chips so i am interested in whether the proposed methods can generalize to other commodity devices eg extending to the nvidia jetson or mobile phone neural chips not just the specific stm series 2 the experiments are mainly based on classification and adopt transfer learning with limited epochs on downstream tasks what about the performance of segmentation and detection tasks 3 also supporting longtime training a large number of epochs is critical for ondevice lifelong learning can this method train from scratch instead of finetuning the pretrained models 4 handling the learning procedure in a timely manner is more critical i think it is better to present the traininginference speed or time cost when applied to the downstream tasks none ### Summary:
in this work the authors propose a framework for training cv models on tiny iot devices with very limited memory the reviewers agreed that the paper is well written and represents a valuable contribution to the area of efficient ondevice ml questions raised by reviewers were sufficiently addressed in the response
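As background for the reviewer's point that normalization layers are absent after deployment because tools like tflite fold them into the preceding convolution, here is a minimal sketch of that standard batch-norm folding step. The function name and standalone form are illustrative assumptions, not the paper's actual pipeline; the sketch only shows why the fusion is a cheap one-off rescaling.

```python
# Minimal sketch of standard batch-norm folding into a preceding Conv2d.
# Generic illustration of what deployment tools do; not MCU-specific code.
import torch

@torch.no_grad()
def fold_bn_into_conv(conv: torch.nn.Conv2d, bn: torch.nn.BatchNorm2d) -> torch.nn.Conv2d:
    fused = torch.nn.Conv2d(
        conv.in_channels, conv.out_channels, conv.kernel_size,
        stride=conv.stride, padding=conv.padding, dilation=conv.dilation,
        groups=conv.groups, bias=True,
    )
    std = torch.sqrt(bn.running_var + bn.eps)
    scale = bn.weight / std                                   # gamma / sqrt(var + eps)
    fused.weight.copy_(conv.weight * scale.reshape(-1, 1, 1, 1))
    conv_bias = conv.bias if conv.bias is not None else torch.zeros_like(bn.running_mean)
    fused.bias.copy_((conv_bias - bn.running_mean) * scale + bn.bias)
    return fused
```

Because folding amounts to a one-off rescaling of the convolution weights and bias, keeping the norm layers around during on-device adaptation and fusing them only for inference, as the reviewer suggests, would add essentially no deployment cost; the open question raised in the reviews is how such norm-layer-only updates would compare with the proposed sparse-update scheme.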
Below is a given review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper proposes several approaches for numerical feature embeddings for tabular deep learning models and experimentally investigates the gains that such approaches provide methodologically simple linear embeddings piecewise linear embeddings and periodicactivationbased embeddings are proposed and experimentally evaluated on a suite of many benchmark tabular datasets the experimental results showcase improved performance competitive with gradient boosted decision trees gbdt on gbdtbiased datasets strengths possible performance improvements from using embeddings for numerical features are showcased the experiments are conducted on a large suite of datasets good average rank improvements over gbdt on a gbdtfriendly benchmark are achieved by using numerical embeddings the paper is very wellwritten and easy to follow the code is in good shape weaknesses these embeddings for numerical features are essentially feature engineering a comparison to gbdt with feature engineering would dramatically improve the paper and the experimental results the methodological contributions of piecewise linear embeddings are similar to feature binning [1] as the paper mentions in the introduction i believe automated feature engineering methods such as eg [2, 3, 4] should be mentioned in related work while the proposed numerical feature embedding approaches are compared to each other it is unclear if they provide any benefit over eg saint [5] with its dense layer or small mlp embeddings [1] dougherty j kohavi r and sahami m 1995 supervised and unsupervised discretization of continuous features in machine learning proceedings 1995 pp 194-202 morgan kaufmann [2] khurana u samulowitz h and turaga d 2018 april feature engineering for predictive modeling using reinforcement learning in proceedings of the aaai conference on artificial intelligence vol 32 no 1 [3] kaul a maheshwary s and pudi v 2017 november autolearn automated feature generation and selection in 2017 ieee international conference on data mining icdm pp 217-226 ieee [4] khurana u turaga d samulowitz h and parthasrathy s 2016 december cognito automated feature engineering for supervised learning in 2016 ieee 16th international conference on data mining workshops icdmw pp 1304-1307 ieee [5] somepalli g goldblum m schwarzschild a bruss cb and goldstein t 2021 saint improved neural networks for tabular data via row attention and contrastive pretraining arxiv preprint arxiv:2106.01342 the authors stated that the work focuses on a generic aspect of deep learning models and that therefore the negative societal impact discussion does not apply the limitations have been adequately addressed docsepthis paper proposes a new encoding scheme for embedding numerical features in neural network models the authors report that for tabular data in many cases neural methods have failed to improve on gradient boosted decision trees gbdt thus necessitating more research into how to represent the tabular data in particular features that are numerical as opposed to categorical the authors propose a variation of a binningbucketing scheme which they call piecewise linear encoding ple which is more informative than a onehot encoding scheme where a numerical feature is simply represented by the embedding corresponding to the binbucket id it belongs to this is clearly a bad scheme for numerical features since the ordering information is lost if x1 < x2 then there is no ordering relationship between
the embedding(x1) and embedding(x2) so the model must first learn this implicit relationship such a scheme therefore loses some information in the case of numerical data the authors propose their ple scheme as follows for the data x if it falls between bin boundaries b_t and b_{t+1} then embedding_i = 0 if i > t, embedding_i = 1 if i < t, and embedding_t = (x - b_t) / (b_{t+1} - b_t) thus this is a piecewise linear scheme which explains the naming of the scheme the authors also consider some variations inspired by the position encoding used in transformer models for text where embedding_i also has a sinusoidal component the authors show experimental results using this scheme where they show that on several tabular datasets this scheme improves on the baseline mlp models using simple linear embeddings and is competitive with decision trees strengths this paper explores a few different representation schemes to make neural networks work better for tabular data and presents empirical results demonstrating that it improves on other simpler representation schemes such as linear or onehot encoding schemes the paper is also clearly written and the notation is easy to follow weaknesses several of the datasets used are likely very small scale so it is possible that in the abundant data regime the gains from this method are small i would also imagine that the reason the decision trees are better than neural networks on tabular data of the kind the authors study is also likely because it seems like a data limited setting in larger ctr datasets like criteo it is not clear that 1 decision trees would be better than mlps and 2 the encoding scheme would give as big of a win yes docsepthis paper addresses tabular data prediction by using deep models the authors argue that embeddings for numerical features are critical but underexplored in previous work in practice two embedding modules termed piecewise linear encoding and periodic activations are proposed to improve the performance of the current deep models for tabular data extensive experimental results demonstrate the effectiveness of the proposed two embedding schemes strengths the paper is clearly written and easy to follow the proposed two embedding schemes are straightforward and have practical effectiveness extensive experimental results illustrate the performance of the proposed method on various deep model architectures weaknesses whats the insight of designing such two embedding modules it is more like the practical solutions that depend on massive experiments which makes me wonder about the intuition behind these choices whats the relationship between these two proposed embedding modules should we need to consider them as independent contributions and why are there any different application scenarios for these two modules which one is more effective in real applications are there any more deep architectures for tabular data analysis and how about their performance by adopting the proposed two modules if the authors aim to demonstrate the generality of the proposed module more recently proposed deep networks but not the simple resnet or transformer should be considered as the baseline models and their performance improvement by using proposed modules should be clearly reported the authors partially discussed the limitations of the current work in the conclusion and future work docsepthe paper presents methods for embedding numerical features for tabular deep learning methods the paper describes the current best practices in the tabular deep learning literature and then proposes two basic approaches for
embedding numerical features and ways to combine and augment them first the authors propose piecewise linear encoding ple where a number is mapped to a t-dimensional vector by partitioning the range of values into t bins corresponding to the coordinates where a coordinate is equal to 1 if the number is above the respective bin 0 if the number is below the bin and otherwise the relative location of the number in the given bin ie (x - low) / (high - low) this is effectively a continuous relaxation of treating numeric features as categorical via quantization different strategies for how to compute the bins are considered the second method is based on mapping to a vector with periodic activation functions ie mapping x to [cos(w_1 x), sin(w_1 x), ..., cos(w_n x), sin(w_n x)] similar to prior proposals for positional embeddings this paper proposes that the two base feature embeddings can be combined with some sequence of linear and relu activation layers before they are fed into the model for evaluating the different embedding methods the paper considers three base models a simple mlp a resnet and a transformer architecture various combinations of models and embedding methods are tested on a variety of tabular datasets the main findings are that with wellchosen numerical feature embeddings deep learningbased approaches can nearly match or exceed the performance of gradient boosted decision trees moreover it is shown that numerical feature embeddings are important for all model types and even an mlp can perform exceptionally well the main strengths of the paper are its comprehensive set of experiments including not just one variant but many possible combinations of the proposed embeddings such a detailed set of results could prove useful for practitioners that might consider adopting one of the proposed methods based on similarity to one of the given datasets in the evaluation moreover tabular deep learning is still a relatively underexplored subject and any push to establish better baselines could prove important however the paper ignores the literature on embedding numerical features found in other domains but effectively addresses the same fundamental problem namely there are many papers on how to treat numerical data found in natural language and specifically how to embed it in the same space as conventional word embeddings many suggestions have been proposed over the years some are identical to the methods in this paper eg the use of periodic activation functions while the authors note that it was proposed for positional embeddings it was also proposed for embedding plain numbers found in the text see 1 for a survey on the various methods used in natural language processing it is also difficult to discern what the bottom line is from the many experimental results regardless of whether the proposed embedding methods are novel or not it is unclear which method is preferred if any and why one method would work better than another on a given dataset there is little attempt to analyze the results or draw any conclusions the paper would be significantly improved if it was concentrated on one specific method rather than all possible combinations of a set of methods finally it is unclear why the paper uses the specific datasets and not others the only explanation given is that these datasets are gbdtfriendly however it is unclear what are the qualifications of a dataset to be considered as gbdtfriendly and why we should ignore other datasets that were previously explored in related literature on tabular deep learning for instance in
2 a different set of datasets is considered though there is some overlap where they already showed that deep learning methods can beat in most cases gbdt in my view this submission lacks a thorough explanation of their methodology for selecting the datasets they analyze and why it is okay to ignore others that have appeared in other papers 1 thawani et al representing numbers in nlp a survey and a vision naacl 2021 2 gorishniy et al revisiting deep learning models for tabular data neurips 2021 the authors discussed the runtime costs associated with using deep learningbased methods relative to decision trees as well as the difficulty in drawing far reaching conclusions based on the selection of datasets and their sample size ### Summary:
the authors propose and study the use of embedding schemes to apply deep learning to tabular problems according to reviewers himf and 5og4 and reading the submission the method is simple and clearly explained the experiments are comprehensive and demonstrate empirical improvements on small scale datasets moreover discussions with reviewers have allowed the authors to provide additional relevant experiments providing comparisons with other methods i recommend this paper for acceptance
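The reviews above describe the two numerical-feature embeddings only in prose, so a small illustrative sketch may help make the definitions concrete. This is a minimal NumPy reconstruction based on the descriptions in the reviews (piecewise linear encoding over precomputed bins, and a periodic sin/cos mapping), not the code released with the paper; the function names, the quantile-based bin construction, and the choice of frequencies are illustrative assumptions.

```python
import numpy as np

def piecewise_linear_encode(x, bins):
    """Piecewise linear encoding (PLE) as described in the reviews.

    bins: array of T+1 strictly increasing boundaries b_0 < b_1 < ... < b_T.
    For a value in bin t, components for bins below it are 1, components for
    bins above it are 0, and component t is (x - b_t) / (b_{t+1} - b_t).
    """
    x = np.asarray(x, dtype=float)
    T = len(bins) - 1
    out = np.empty((len(x), T))
    for t in range(T):
        lo, hi = bins[t], bins[t + 1]
        out[:, t] = np.clip((x - lo) / (hi - lo), 0.0, 1.0)
    return out

def periodic_encode(x, frequencies):
    """Map each scalar x to [cos(w_1 x), sin(w_1 x), ..., cos(w_n x), sin(w_n x)]."""
    x = np.asarray(x, dtype=float)[:, None]
    w = np.asarray(frequencies, dtype=float)[None, :]
    return np.concatenate([np.cos(w * x), np.sin(w * x)], axis=1)

# toy usage: quantile bins estimated from a hypothetical training column
rng = np.random.default_rng(0)
train_col = rng.gamma(2.0, 3.0, size=1000)
bins = np.quantile(train_col, np.linspace(0.0, 1.0, 9))                 # 8 bins
ple_features = piecewise_linear_encode(train_col[:5], bins)             # shape (5, 8)
periodic_features = periodic_encode(train_col[:5], 2.0 ** np.arange(4)) # shape (5, 8)
```

Either representation can then be passed through the linear (and optionally ReLU) layers mentioned in the reviews before being fed to an MLP, ResNet, or transformer backbone.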
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: authors show how the nervous system of c elegans can be modelled and simulated with datadriven models using different neural network architectures specifically they target the use of state of the art recurrent neural network architectures such as lstms and grus and compare these architectures in terms of their properties and their rmse as well as the complexity of the resulting models authors show that gru models with a hidden layer size of 4 units are able to accurately reproduce the systems response to very different stimuli in this paper authors create models for the c elegans nervous system with three different recurrent neural network architectures simple rnns lstms and grus the objective is to further generate a loworder description to replace the original detailed model in the neuron simulator authors should improve the automation in choosing appropriate stimuli for the training validation and test sets as well as optimal parameter selection and perform a systematic analysis of compression possibilities of the learningbased models with error control overall its a weak accept authors should improve the automation in choosing appropriate stimuli for the training validation and test sets as well as optimal parameter selection and perform a systematic analysis of compression possibilities of the learningbased models with error control docsepthe paper shows that a small recurrent neural network can be used to predict the activity of 4 neurons in a c elegans simulation with good accuracy as measured by rmse the paper uses a connectomebased inhouse model of the worm utilizing compartmental neuron models and chemical and electrical synapses to generate the ground truth data in general i found that insufficient details are provided about how this model was built and validated how were the biophysical properties chosen what synapse models were used etc the authors state that the simulator is highfidelity because they assume it reproduces with fidelity the real output of c elegans neurons the model might be highcomplexity but as far as i could see fidelity was not demonstrated anywhere in fact the only test case also used for all subsequent modeling work with neural networks is forward locomotion in it 2 sensory neurons and 2 interneurons are stimulated to generate activity in 4 neurons known to be active in locomotion its possible that additional information is available in the supplementary materials which i was not able to access (error: the value of offset is out of range, it must be between 0 and 17825792, received 17825795) even if thats the case more of this information should be in the main text i found section 31 with detailed discussion of gru lstm and problems related to training rnns a bit too verbose these are standard and wellknown architectures at this point and could just be briefly introduced with references to the original papers the reader would benefit much more from learning the details of the c elegans simulation the paper does a good job of systematically testing various variants of recurrent neural networks lstm gru vanilla rnn in a simple setup involving a single recurrent layer followed by a dense layer with 4 outputs the authors focus on gru and conclude that a tiny neural network with just 4 gru units is sufficient that a gru network works for modeling timeseries data is not particularly surprising i found the experimental work presented in the paper far too limited to
yield useful insights into using rnns as a blackbox model of real brain activity ideally the experiments should cover more neurons and behaviors and variations in synapse strengths see doi 10.1038/s41586-021-03778-8 and the biophysical parameters what is missing is also a comparison of the computational complexity of the simulation and the proposed reduced order model one could also ask whether the model used in the simulation is actually necessary to produce this forward motion scenario could some neurons be excluded or simpler neuron models used instead specific comments the text mentions that the separation of the simulation results into traintestvalidation was done manually please provide more details on the criteria you used to ensure diversity within each group in experiment 3 you state that data is sampled with different time steps as this leads to longer sequences this should probably be rephrased to highlight that the same physical time is simulated what exactly is the input to the network only the 2 currents on the sensory neurons or also on the interneurons why is it necessary to provide stimulus on interneurons instead of just sensory neurons the paper effectively shows that a tiny gru network can reproduce a few timeseries generated from a larger simulation the general research direction of building reducedorder models for such simulations is interesting but the very limited empirical results presented here and lack of details about the simulation make it impossible for me to recommend acceptance docsepthis paper investigates the use of recurrent neural networks as a model reduction tool in computational neuroscience more specifically the authors consider the problem of predicting the activity of a set of four neurons in the c elegans nematode worm resulting from the simulated electrical stimulation of other neurons in the animals connectome three experiments are carried out testing different network architectures network sizes and the effect of increasing the temporal resolution of the data the results show that a small grubased network is sufficient for achieving excellent agreement with the starting data which was generated using computationally demanding simulations of a network of multicompartmental neurons the paper also includes an introduction on popular rnn architectures and on the problem of model reduction in computational neuroscience strengths the paper is very well written and is an enjoyable read it explains well the problem it is trying to address and why its important and it introduces all concepts and techniques it uses in an accessible way i think this is particularly important for this paper as its main audience would presumably be computational neuroscientists so it seems a good idea not to take for granted a deep knowledge of say recurrent network architectures addressing the problem of model reduction using offtheshelf machine learning approaches is a timely choice given the very rapid increase of experimental data available for modeling in this field the main thesis of the paper is well supported by the evidence which is laid out in a logical and clear progression weaknesses mirroring what i wrote above in one of the strengths points the fact that the paper is a rather simple application of offtheshelf rnn architectures to a welldefined modeling problem means that the paper is not hugely ambitious and that it doesnt bring much in the way of theoretical or conceptual insight however in my opinion this does not detract from the quality of the work minor suggestions
when discussing related literature i invite the authors to consider the inclusion of a broader selection of papers that in recent years applied deep network approaches to modeling neural activity data as the works currently cited in the paper investigate only fmri data which is only distantly related to actual neural activity or are specific to c elegans two examples that come to mind are molanomazon et al iclr 2018 synthesizing realistic neural population activity patterns using generative adversarial networks and bellec et al neurips 2021 fitting summary statistics of neural data with a differentiable spiking network simulator if possible it would be great to have an actual comparison of the compute timecost incurred in running the trained rnn even just for the final selected version of the architecture the 4 or 8units gru versus running the groundtruth simulation with neuron i apologise if this is given somewhere and i missed it please make all figure labels bigger axes labels tick labels and legend contents are currently almost impossible to read on page 7 last line of section 51 we consider it the main option and keeping the lstm as an alternative architecture should be keep this is a solid paper that addresses a question which will be of interest to part of the iclr community the research is somewhat limited in scope but very well executed and written up and overall a worthy contribution it is also a good demonstration of the usefulness of open databases of neuroscience models and data my recommendation is to accept ### Summary:
this paper explores the use of recurrent neural networks to model neural activity timeseries data the hope is that computationally demanding biophysical models of neural circuits could be replaced by rnns when the goal is simply to capture the right inputoutput functions the authors show that they can fit rnns to the behaviour of a complex biophysical model of the c elegans nervous system and they explore the space of hyperparameter and network choices that lead to the best fits the reviews for this paper were borderline with scores of 3 6 and 8 on the positive side the reviewers agreed that the paper is very effective in demonstrating that the inputoutput behaviour of the biophysical model of c elegans can be replicated by rnns but on the negative side there were concerns about the limited nature of the empirical results lack of details about the simulation too much emphasis in describing wellknown rnn architectures and lack of systematic strategy for applying this technique in other systems the rebuttals did not change the borderline scores thus this is an instance where the ac must be a bit more involved in the decision after reading the paper and reviews the ac felt that this work was not sufficiently general in its application ultimately using artificial neural networks to fit neural data is common practice nowadays so really this paper serves as a proofofconcept for replacing a complex biophysical model with a simpler rnn but given that rnns are quite good at modelling sequence data its not terribly surprising that this works moreover though the authors do a very careful search over network design decisions they dont provide a systematic strategy for others to employ if they so wished also the authors do not provide much insight into what the rnns learn that might help us to better understand the modelled neural circuits and most importantly this only demonstrates the effectiveness for systems where we have biophysical models with wellestablished accuracy which is not the case for most neural circuits given these considerations a reject decision was reached
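The reviews and the meta-review above discuss the surrogate network only at a high level (a single recurrent layer followed by a dense readout, with as few as 4 GRU units predicting 4 neuron traces), so a brief sketch may help readers picture the setup. This is a hypothetical PyTorch reconstruction of that architecture, not the authors' code; the input dimensionality, sequence length, optimiser settings, and the synthetic stand-in data are all assumptions.

```python
import torch
import torch.nn as nn

class TinyGRUSurrogate(nn.Module):
    """Single GRU layer plus a dense readout, the architecture the reviews describe."""

    def __init__(self, n_inputs=4, n_hidden=4, n_outputs=4):
        super().__init__()
        self.gru = nn.GRU(n_inputs, n_hidden, batch_first=True)
        self.readout = nn.Linear(n_hidden, n_outputs)

    def forward(self, stim):              # stim: (batch, time, n_inputs)
        h, _ = self.gru(stim)             # (batch, time, n_hidden)
        return self.readout(h)            # (batch, time, n_outputs)

model = TinyGRUSurrogate()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
mse = nn.MSELoss()                        # the reported RMSE is the square root of this

# hypothetical stimulus currents and target neuron traces standing in for simulator output
stim = torch.randn(8, 200, 4)
target = torch.randn(8, 200, 4)
for _ in range(100):
    optimizer.zero_grad()
    loss = mse(model(stim), target)
    loss.backward()
    optimizer.step()
```

A surrogate this small is presumably far cheaper to run than the compartmental simulation it replaces, which is exactly the cost comparison several reviewers asked the authors to report explicitly.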
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 43355, 921, 849, 253, 11219, 985, 273, 260, 44984, 476, 320, 41329, 285, 15524, 342, 2856, 324, 1069, 257, 3210, 970, 1027, 11454, 2990, 35615, 5742, 597, 2303, 253, 897, 273, 1375, 273, 253, 1445, 18902, 11454, 6928, 35615, 824, 347, 298, 296, 983, 285, 650, 316, 285, 7277, 841, 35615, 275, 2426, 273, 616, 3607, 285, 616, 40373, 339, 347, 973, 347, 253, 10454, 273, 253, 4795, 3210, 4477, 921, 326, 26970, 3210, 342, 247, 8763, 3828, 1979, 273, 577, 5085, 403, 2104, 281, 13613, 18302, 253, 2718, 2380, 281, 1077, 1027, 15374, 275, 436, 2929, 4477, 2794, 3210, 323, 253, 260, 44984, 11219, 985, 342, 1264, 1027, 18902, 11454, 6928, 35615, 2969, 391, 79, 2224, 298, 296, 983, 285, 650, 316, 253, 8103, 310, 281, 2007, 6635, 247, 1698, 2621, 5740, 281, 8171, 253, 3236, 7000, 1566, 275, 253, 23586, 40022, 50275, 43355, 943, 3157, 253, 29885, 275, 13887, 4569, 15374, 323, 253, 3733, 12820, 285, 1071, 5239, 347, 973, 347, 8654, 4764, 5438, 285, 1347, 247, 12082, 1783, 273, 13800, 15018, 273, 253, 4715, 3169, 3210, 342, 2228, 1453, 4583, 697, 247, 5075, 2997, 4477, 943, 3157, 253, 29885, 275, 13887, 4569, 15374, 323, 253, 3733, 12820, 285, 1071, 5239, 347, 973, 347, 8654, 4764, 5438, 285, 1347, 247, 12082, 1783, 273, 13800, 15018, 273, 253, 4715, 3169, 3210, 342, 2228, 1453, 5474, 339, 431, 248, 2929, 2722, 326, 247, 1355, 18902, 11454, 2990, 476, 320, 908, 281, 3283, 253, 2425, 273, 577, 8512, 275, 247, 260, 44984, 9864, 342, 1175, 7200, 347, 4080, 407, 40373, 339, 50275, 783, 2929, 4648, 247, 4684, 485, 3169, 275, 5967, 1566, 273, 253, 28384, 17617, 18862, 267, 23586, 3210, 285, 5793, 285, 8545, 40041, 281, 6635, 253, 3216, 5083, 941, 275, 2087, 891, 1119, 326, 12497, 4278, 403, 2530, 670, 849, 436, 1566, 369, 4270, 50276, 7210, 456, 849, 497, 253, 1794, 40947, 3607, 6777, 752, 2753, 8023, 3210, 497, 908, 3966, 253, 4477, 1375, 326, 253, 40022, 310, 1029, 71, 21718, 984, 597, 5467, 352, 7598, 707, 342, 32422, 253, 1524, 3453, 273, 260, 44984, 8512, 253, 1566, 1537, 320, 1029, 19017, 414, 533, 347, 2080, 347, 891, 812, 923, 32422, 369, 417, 5183, 9825, 275, 958, 253, 760, 1071, 1083, 671, 908, 323, 512, 6774, 14053, 789, 342, 11454, 6928, 310, 3579, 23904, 5011, 275, 352, 374, 17872, 8512, 285, 374, 734, 32167, 790, 403, 18606, 281, 6635, 2425, 275, 577, 8512, 1929, 281, 320, 3939, 275, 23904, 5011, 697, 1896, 326, 3081, 1491, 310, 2130, 275, 253, 24864, 4753, 534, 891, 369, 417, 2104, 281, 2289, 2228, 296, 248, 1318, 273, 8409, 310, 562, 273, 2491, 352, 1364, 320, 50276, 17, 50275, 20070, 1099, 39282, 2959, 24833, 21553, 2222, 1014, 604, 28763, 253, 1083, 625, 273, 436, 1491, 943, 320, 275, 253, 2022, 2505, 50276, 74, 1119, 2593, 4562, 342, 7000, 5955, 273, 26970, 298, 296, 78, 285, 3237, 2905, 281, 3733, 391, 79, 2224, 247, 2372, 1512, 48656, 841, 403, 2629, 285, 973, 4304, 35615, 387, 436, 1127, 285, 812, 816, 320, 13366, 5611, 342, 10414, 281, 253, 3236, 9380, 253, 9414, 651, 5649, 1199, 625, 432, 4715, 253, 4278, 273, 253, 260, 44984, 9864, 50276, 783, 2929, 1057, 247, 1175, 2628, 273, 24181, 5175, 2710, 11640, 273, 18902, 11454, 6928, 298, 296, 78, 26970, 26724, 391, 9866, 275, 247, 2969, 9978, 7668, 247, 2014, 18902, 3828, 3560, 407, 247, 14086, 3828, 342, 577, 18012, 253, 4477, 2770, 327, 26970, 285, 7525, 326, 247, 10058, 11454, 2990, 342, 816, 577, 26970, 5085, 310, 4209, 50276, 3529, 247, 26970, 2990, 2987, 323, 14053, 2069, 12395, 941, 310, 417, 
3782, 10084, 891, 1119, 253, 5661, 789, 3559, 275, 253, 2929, 2080, 1512, 3710, 281, 4917, 4217, 16039, 715, 970, 391, 79, 2224, 347, 247, 2806, 3364, 1566, 273, 1524, 3998, 2425, 34243, 253, 4679, 943, 3835, 625, 8512, 285, 13576, 285, 10575, 275, 2753, 8023, 20544, 923, 28076, 6903, 22439, 84, 21, 18663, 1549, 16899, 22863, 2055, 285, 253, 1794, 40947, 3602, 752, 310, 5816, 310, 671, 247, 5301, 273, 253, 15180, 10454, 273, 253, 9864, 285, 253, 4081, 3777, 1340, 1566, 581, 812, 671, 1642, 1880, 253, 1566, 908, 275, 253, 9864, 310, 2686, 3309, 281, 4711, 436, 3579, 3200, 10076, 50276, 16534, 690, 8512, 320, 10432, 390, 19554, 23586, 3210, 908, 3185, 50276, 6160, 5701, 50276, 783, 2505, 25957, 326, 253, 9712, 273, 253, 9864, 1543, 715, 1140, 565, 383, 29599, 369, 2218, 13542, 4496, 2085, 625, 4278, 327, 253, 6866, 368, 908, 281, 5416, 9991, 1561, 1016, 1387, 50276, 249, 3368, 495, 368, 1375, 326, 50276, 2203, 19958, 342, 1027, 673, 5018, 347, 436, 5644, 281, 3356, 6430, 436, 943, 3164, 320, 294, 545, 83, 833, 281, 6780, 326, 253, 1072, 3520, 673, 310, 15524, 50276, 5371, 4555, 310, 253, 3280, 281, 253, 2990, 760, 253, 374, 18476, 327, 253, 17872, 8512, 390, 671, 327, 253, 734, 32167, 790, 50276, 22309, 310, 352, 3309, 281, 2085, 15199, 327, 734, 32167, 790, 3185, 273, 816, 17872, 8512, 50274, 783, 2929, 8069, 2722, 326, 247, 10058, 26970, 2990, 476, 18302, 247, 1643, 2069, 12395, 4561, 432, 247, 4067, 9864, 253, 2087, 2561, 3884, 273, 3652, 3777, 2621, 3210, 323, 824, 9938, 310, 4722, 533, 253, 1077, 3710, 16774, 1543, 3559, 1060, 285, 3480, 273, 4278, 670, 253, 9864, 1056, 352, 7479, 323, 479, 281, 5583, 14924, 50276, 7152, 33032, 2520, 2929, 2340, 684, 253, 897, 273, 18902, 11454, 6928, 347, 247, 1566, 5141, 4968, 275, 15180, 6551, 21559, 625, 5742, 253, 4477, 1908, 253, 1895, 273, 21565, 253, 2425, 273, 247, 873, 273, 1740, 8512, 275, 253, 260, 44984, 44087, 853, 28384, 4795, 432, 253, 15524, 8545, 10277, 273, 643, 8512, 275, 253, 5074, 4684, 485, 1264, 4679, 403, 4824, 562, 5175, 1027, 2990, 35615, 2990, 9552, 285, 253, 1055, 273, 3629, 253, 11935, 6064, 273, 253, 941, 253, 1543, 921, 326, 247, 1355, 33336, 833, 2990, 310, 4209, 323, 17170, 7126, 4345, 342, 253, 4983, 941, 534, 369, 4561, 970, 43245, 17905, 9938, 273, 247, 2990, 273, 23559, 297, 2003, 11471, 8512, 253, 2929, 671, 3797, 271, 10199, 327, 4633, 391, 9866, 35615, 285, 327, 253, 1895, 273, 1566, 5141, 275, 15180, 6551, 21559, 50275, 296, 3755, 384, 84, 50276, 783, 2929, 310, 1077, 973, 3542, 285, 310, 271, 30357, 1239, 352, 11424, 50275, 4714, 253, 1895, 352, 310, 2820, 281, 2953, 285, 2139, 697, 1774, 285, 50275, 262, 23970, 512, 12342, 285, 5609, 352, 4648, 275, 271, 12482, 50275, 1106, 891, 1158, 436, 310, 3782, 1774, 323, 436, 2929, 347, 697, 50275, 7265, 8446, 651, 18289, 320, 15180, 6551, 30202, 1346, 594, 50275, 262, 3133, 247, 1175, 2934, 417, 281, 1379, 323, 7169, 247, 3676, 3640, 273, 50275, 19506, 18902, 2990, 35615, 50276, 12025, 272, 253, 1895, 273, 1566, 5141, 970, 273, 649, 1041, 48164, 50275, 28936, 4715, 7274, 310, 247, 14793, 4327, 1677, 253, 1077, 5233, 50275, 19687, 511, 273, 5661, 941, 2130, 323, 14053, 275, 436, 1673, 50276, 783, 2022, 22857, 273, 253, 2929, 310, 973, 4516, 407, 253, 1941, 50275, 4609, 310, 10090, 562, 275, 247, 13760, 285, 2590, 10005, 50275, 20881, 1255, 265, 50276, 17001, 83, 4263, 752, 891, 4159, 1840, 275, 581, 273, 253, 4056, 384, 84, 2792, 253, 50275, 12690, 326, 2929, 310, 247, 2581, 2969, 2898, 273, 273, 649, 1041, 48164, 391, 9866, 50275, 1116, 5671, 980, 281, 247, 6210, 392, 
37224, 14053, 1895, 2097, 326, 253, 50275, 20790, 310, 417, 40704, 24683, 285, 326, 352, 36908, 3324, 1199, 275, 253, 50275, 1106, 273, 10527, 390, 20178, 12288, 2299, 275, 619, 4743, 50275, 2520, 1057, 417, 843, 974, 432, 253, 3290, 273, 253, 789, 50275, 37585, 13991, 50276, 9453, 16585, 2905, 6239, 891, 19864, 253, 4477, 281, 1908, 50275, 783, 11250, 273, 247, 16055, 5438, 273, 9380, 326, 275, 3332, 1107, 50275, 46188, 3676, 2990, 7274, 281, 14053, 11454, 2425, 941, 347, 50275, 783, 2987, 4390, 11106, 275, 253, 2929, 7409, 760, 49555, 363, 941, 50275, 4609, 310, 760, 940, 5954, 2905, 281, 4588, 11454, 2425, 390, 403, 50275, 6160, 281, 260, 44984, 767, 6667, 326, 1705, 281, 2564, 403, 50275, 17071, 266, 297, 1370, 251, 1162, 355, 17857, 32888, 4765, 35143, 3006, 15958, 11454, 50275, 30957, 2425, 6127, 970, 1006, 800, 48960, 6928, 50275, 395, 1112, 282, 68, 1162, 355, 5723, 2824, 43425, 13532, 6010, 9990, 273, 11454, 50275, 2203, 342, 247, 46350, 653, 16434, 2990, 40022, 50276, 338, 1896, 352, 651, 320, 1270, 281, 452, 271, 4588, 5301, 273, 253, 50275, 39413, 673, 16736, 23122, 275, 3515, 253, 10166, 391, 9866, 1014, 816, 323, 50275, 783, 2457, 4236, 2715, 273, 253, 10336, 253, 577, 390, 854, 19920, 50275, 30107, 7147, 3515, 253, 3216, 33024, 9864, 342, 23586, 891, 50275, 522, 862, 885, 604, 436, 310, 1677, 9366, 285, 891, 9829, 352, 50276, 32897, 1056, 512, 4677, 13301, 8750, 24039, 13301, 7049, 13301, 285, 50275, 42262, 9410, 403, 4390, 2761, 7479, 281, 1239, 50276, 251, 3239, 818, 1390, 1386, 273, 2593, 8319, 50276, 664, 1908, 352, 253, 2022, 50275, 7872, 285, 7562, 253, 298, 296, 78, 347, 271, 5795, 10336, 50276, 11425, 50275, 1257, 1978, 50275, 2520, 310, 247, 4891, 2929, 326, 12453, 247, 1953, 534, 588, 320, 273, 1600, 281, 629, 273, 253, 17857, 32888, 3114, 253, 2561, 310, 8489, 3710, 275, 7990, 533, 1077, 973, 11407, 285, 3542, 598, 285, 4583, 247, 18338, 7680, 352, 310, 671, 247, 1175, 20028, 273, 253, 31471, 273, 1527, 16634, 273, 6551, 21559, 3210, 285, 941, 50276, 2577, 17401, 310, 281, 2997, 2490, 187, 4118, 18435, 27, 2520, 2929, 33826, 253, 897, 273, 18902, 11454, 6928, 281, 1566, 11454, 2425, 2069, 12395, 941, 253, 3524, 310, 326, 43245, 17905, 1794, 40947, 3210, 273, 11454, 14174, 812, 320, 7932, 407, 391, 79, 2224, 672, 253, 4736, 310, 3365, 281, 9232, 253, 987, 3280, 9252, 3470, 253, 4477, 921, 326, 597, 476, 4944, 391, 79, 2224, 281, 253, 8770, 273, 247, 2570, 1794, 40947, 1566, 273, 253, 260, 44984, 11219, 985, 285, 597, 8338, 253, 2317, 273, 4373, 19484, 285, 2990, 10165, 326, 1421, 281, 253, 1682, 13840, 50276, 783, 10123, 323, 436, 2929, 497, 45210, 342, 7363, 273, 495, 721, 285, 854, 327, 253, 2762, 1930, 253, 30628, 5821, 326, 253, 2929, 310, 1077, 3576, 275, 17227, 326, 253, 3280, 9252, 8770, 273, 253, 1794, 40947, 1566, 273, 260, 44984, 476, 320, 37221, 407, 391, 79, 2224, 533, 327, 253, 4016, 1930, 627, 497, 7350, 670, 253, 3710, 3753, 273, 253, 16774, 1543, 3480, 273, 4278, 670, 253, 9864, 1512, 1199, 15075, 275, 12930, 973, 4304, 391, 9866, 35615, 285, 3480, 273, 12082, 5700, 323, 9433, 436, 5853, 275, 643, 2718, 253, 30080, 85, 932, 858, 417, 1818, 253, 45210, 7363, 50276, 40622, 436, 310, 271, 4227, 835, 253, 913, 1364, 320, 247, 2372, 625, 3206, 275, 253, 3061, 846, 4361, 253, 2929, 285, 10123, 253, 913, 3543, 326, 436, 789, 369, 417, 10481, 2087, 275, 697, 2898, 9142, 970, 13345, 11454, 6928, 281, 4944, 11454, 941, 310, 1846, 3946, 31735, 594, 1663, 436, 2929, 11029, 347, 247, 4737, 1171, 31503, 323, 15706, 247, 2570, 1794, 40947, 1566, 342, 247, 19554, 
391, 9866, 533, 1677, 326, 391, 79, 2224, 403, 3240, 1175, 387, 26278, 3425, 941, 697, 417, 30643, 10084, 326, 436, 2987, 25761, 2167, 253, 4477, 513, 247, 1077, 10182, 3186, 689, 2990, 2216, 7089, 597, 13414, 2085, 247, 12082, 5700, 323, 2571, 281, 2126, 604, 597, 594, 16632, 671, 253, 4477, 513, 417, 2085, 1199, 12288, 715, 752, 253, 391, 79, 2224, 3037, 326, 1537, 1361, 441, 281, 1805, 2096, 253, 41329, 11454, 14174, 285, 954, 15538, 436, 760, 14371, 253, 12510, 323, 2718, 835, 359, 452, 1794, 40947, 3210, 342, 973, 21877, 7200, 534, 310, 417, 253, 1083, 323, 954, 11454, 14174, 1677, 841, 15711, 247, 12009, 3061, 369, 4925 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: optimizing sparse neural networks is an important topic due to their computational and space savings building on the work of the lottery ticket hypothesis others have shown there exist hidden subnetworks within randomly initialized nn that have good performance the authors extend this definition to disguised subnetworks which contain hidden subnetworks as a subclass moreover the authors present a novel combination of existing methods into a single algorithm they call peekaboo pab which can efficiently find such networks strengths the paper is well written and does a fair job covering recent works the definition of disguised subnetworks is clear and properly motivated the presentation of the pab is clear and well motivated with good context the results are quite convincing that pab scales well to larger networks something other methods lack while providing good accuracy for much smaller and easier to train models weakness 1 i would have liked to see some more theoretical discussion though the authors admit this work is primarily empirical 2 it would be nice to see a discussion of other weight transformations u in the unmasking phase that may be useful 3 though the definition of disguised subnetworks is new and the generalized approach is novel and the results speak to the validity of the approach the pab algorithm is a relatively straightforward application of two existing methods the paper is well written and does a good job covering all the related material the definition of disguised subnetworks is novel and will be useful for future researchers the pab algorithm clearly yields good results on larger nn something other methods lack more exploration of theoretical understanding as well as other weight transformations would be useful docsepthis paper extends the definition of the hidden subnetworks in randomly initiated neural networks the new notion of subnetworks namely the disguised subnetworks apply a transformation on the hidden subnetwork weights to obtain final weights a posteriori finding the hidden subnetwork mathematically speaking the main decision variable of the underlying optimisation problem of finding hidden subnetworks is the socalled masking variable that is a binary vector the variable that decides which components of the randomly initiated weights will be zero constrained to a desired level of sparsity on the other hand the underlying optimisation problem of disguised subnetworks has an additional variable u that applies a transformation on the set of weights after being sparsified the selection of u = i for i being the identity transformation recovers the problem of hidden subnetworks showing that the latter is a generalisation the authors present a heuristic algorithm that solves the forenamed optimisation problem the idea is to first find a solution for the masking variable where i u is set to the identity and ii the objective function is independent of a training set afterward the algorithm proceeds by solving the problem for u given the solution of the previous step where the space of u is restricted to the class of transformations where only signflips are allowed if we think of the above solution process as a twophase problem the authors use the literature on sparse neural networks for the first phase and use the literature on binary neural networks for the sign flipping phase i think the paper is extremely wellwritten there are no typos the language and the presentation are
very strong and the flow is very fluent moreover the literature is summarised very precisely and it includes most of the relevant papers to discuss to the best of my knowledge the algorithm is described well and the contribution is discussed in a fair way without unrealistic claims my major concerns are as follows 1 contribution problem 2 formally describes the optimal hidden subnetwork problem and it immediately follows from the definition of problem 1 that what the authors suggest will be an improvement over the training data this is not unimaginable that is it is already known that any transformation of weights we optimise over the training data will be an improvement 2 novelty as discussed above rather than the definition of disguised subnetworks what matters most here is i the classes of u transformation that can be handled in this work and ii novelty of the solution algorithm the work proposes regarding these two points i the class of u is constrained to sign flips which is already wellstudied and being applied in similar settings ii the solution algorithm splits the problem into two subproblems the first subproblem solves for m the masking variable by using existing training dataindependent methods cf synflow after fixing m the algorithm simply proceeds with the bop method from the bnn optimisation literature my concern here is that since in the loss function the order of these variables is loss(u, m) first solving for m and then for u is the most trivial way and while doing this using all of the wellknown results is not bringing much novelty 3 generality the paper is defining a very general problem of optimal disguised subnetworks and mentioning this in the titleabstract and throughout the paper however in my view this work is more like a casestudy where heuristic methods are being used iteratively namely the contribution is actually taking a recent paper that finds a masking variable and just flipping the signs wrt the training loss this is still an interesting result but i also would like to highlight the distinction from my perspective finally i would like to highlight that although the optimization formulation of disguised subnetworks finds u transformation and m masking jointly the relaxation proposed pab first splits the problem so that the solution algorithm first finds a subnetwork ie m is found and then changes the weights so this algorithm does not find a disguised subnetwork as claimed in the introduction rather finds a subnetwork and then reassigns the weights which is a disguised subnetwork but not a new architecture i believe the problem definition of disguised networks is very general but pab first finds the hidden architecture so this is not a new way of finding subnetworks 4 numerical experiments this is a question for the authors rather than a direct concern some of the algorithms in the numerical experiments where pab is being compared with are trainingindependent algorithms the fact that they are trainingindependent or sometimes called without training methods is the main focus in the literature around them malach et al 2020 on the work of ramanujan et al 2019 comment within a sufficiently overparameterized neural network with random weights eg at initialization there exists a subnetwork that achieves competitive accuracy is this mentioned in the paper otherwise the fact that here there is an optimisation procedure after fixing the sparsity structure based on the training set already changes the setting of the problem here and introduces a bias in comparison however i
still think the idea of combining these existing methods to find a solution for the optimal disguised subnetwork problem and hopefully a good one can pave the way for a new research direction here the key standpoint is rather than the generality or the novelty of solutions to demonstrate that instead of solving a dense network and transforming the weights or to mask for a sparse network one can combine these methods to have a sparse network that also performs better than the alternatives the numerical experiments are conducted and reported carefully and the results look promising hence despite the lack of theoretical contributions i would like to give my decision as marginally below the acceptance threshold minor comments page 2 even worse is maybe better to rewrite page 3 unfortunately the original experiments in zhou et al 2019 maybe change unfortunately page 5 in the first sentence s is not capital but previously it was the same happens again in the unmasking phase part of s3.2 computationally infeasible → computationally intractable argmin → in argmin or prove that the argmin set is a singleton page 5 for example one sparse nn of k nonzero elements is this called an example because k is named maybe it is better to connect with before that is if a sparse nn page 5 optimisation over 2^k many vectors is called nphard i am not sure but this terminology may be misleading the optimisation problem itself can be tractable but there the input is just intractably large in k i am not very sure about whether it is common to say the problem is nphard for example with that logic max(a1, a2, ..., an) that is found in linear time is nphard in log n page 7 and the psg variant there is an additional comma before the full stop pros the paper has a very clear overview of the relevant research and in general the paper is written very well the numerical experiments are very thorough the idea of generalizing the notion of hidden subnetworks is relevant cons the contribution is limited the definition of disguised networks is straightforward and the solution method comprises a subsequent application of existing algorithms not parallelized or nested but applied sequentially the optimal disguised networks problem is restricted iteratively so that the end result is not very interesting anymore docsepthis paper proposes the idea of a disguised subnetwork which are hidden random subnetworks that can be transformed into a wellperforming subnetwork the paper introduces pab as a way to uncover these subnetworks by first searching for a mask over the random weights using pruningatinitialization techniques and then learning a transformation on the subnetwork the paper further shows that this pab process can be efficiently implemented offering significant advantage over prior work strength this paper is very well written and a pleasure to read the paper tackles an important question which is to rethink the optimization strategies of sparse neural networks especially when using the idea of masking as training the observation that when we are only looking to flip signs of random weights during training we can tolerate much coarser gradients is insightful the results are significant as it reduces training cost compared to fully trained networks while maintaining competitive performance the method is compared to an appropriate set of baselines over a reasonable set of model architectures weakness how exactly is the compression ratio calculated the authors mention that this is due to the fact that the random initialization can be stored using a
single seed while this may be true in some cases i would be hesitant to rely on this attribute as the seed may not result in the same parameters in different machines it would be more reasonable to assume that the weight values must be stored for a more conservative estimate most of the work in supermasks consider the signed constant variant where the weights are converted to a single signed value for each layer this parameterization has often shown improved performance over the parameterization using actual weight values and leads to significant reductions in model complexity and size how would this variant work under the pab framework the method is novel and achieves better efficiencyperformance tradeoff than the various baselines it tackles the question of rethinking optimization strategies for sparse nn training which i think is an important research direction i think the contributions of this paper are significant and would support its acceptance docsepthis paper presents an algorithm named peekaboo pab to optimize network pruning at initialization and optimization limited to flipping the sign of weights this setting has not been studied by prior works a twostep algorithm was designed pruning first optimization second experiments show competitive performance compared to prior methods with similar optimization complexity though the setting is new and interesting i shall say that its value to the community has not been perfectly revealed based on the existing experiments in the studied cases the neural networks and datasets are mostly small the value of this work is supposed to be accelerating largescale network training but i am not sure that it scales up well second it seems that the technical contribution of the proposed algorithm is limited when i see the objective function 1 i was expecting that pruning and optimization are jointly performed however according to algorithm 1 they were performed separately and the pruning part follows an existing algorithm separate optimization brings the issue that the best pruned architecture may not fit very well in the unmasking setting therefore i am wondering if there is an iterative solution that pruning is divided into several steps eg one decides to keep 50% of the weights so he/she can prune 10% of the weights every time for 5 rounds and after each step the flipping operation is performed this can align both parts and hopefully improve the final performance of course this may increase the training bitops but this shall be a tradeoff anyway showing whether this strategy helps is not a bad trial third the experiments are performed on cifar datasets since imagenet results are missing i am not sure if the approach can actually scale up in particular according to tables 1 and 2 pab does not show a significant advantage over sgd which further limits the application of the proposed approach i think imagenet experiments might be a good addon to this paper 1 the setting is new but the practical value to the community seems unclear 2 the algorithm is straightforward joint optimization is not considered 3 experiments are promising but not exciting imagenet results are absent overall a bit
below the acceptance threshold ### Summary:
i recommend acceptance this paper presents an interesting inbetween of work on lottery tickets and work on supermasks and i think it is sufficiently novel to merit acceptance even if the significance of the results will need to be left to the judgment of future researchers the reviewers seem broadly in favor of acceptance and i defer to their judgment as a proxy for that signal for a quick bit of context work on supermasks zhou et al 2019 has shown that randomly initialized networks contain subnetworks that can reach high accuracy without training the weights themselves that is to say within randomly initialized networks are highaccuracy subnetworks this work is interesting in its own right and has had a number of interesting implications for the theoretical community this work derives from work on the lottery ticket hypothesis lth frankle carbin 2019 which shows that randomly initialized networks contain subnetworks that can train to full accuracy on their own the key distinctions between these two kinds of work are 1 the lth trains the subnetworks while supermask work does not and 2 the lth work requires that the subnetworks train to full accuracy while work on supermasks obtain high but not full accuracy in many cases no one approach is better than the other they simply showcase different properties of neural networks as far as i understand this paper creates space for an inbetween highaccuracy subnetworks are created by finding subnetworks at random initialization and flipping the signs of some of the weights to improve accuracy this is a limited modification to the subnetworks that falls short of actually training them lth work but is more than leaving them at their random initializations supermask work doing so appears to produce subnetworks that perform better than in supermask work but with a lighterweight procedure than lth work the procedure for accomplishing this feat is different than either approach using synflow to find the subnetwork and a binary neural network training scheme to find the signs and there is probably significant room for improvement in this new algorithmic space just as there was for both lth and supermasks this is novel and interesting and i defer to the reviewers who find it worthy of acceptance i have reservations about the eventual significance of the work but that determination will be made by future researchers
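To make the two-phase idea described in the reviews and summary above concrete, here is a minimal illustrative sketch in Python. It is not the reviewed paper's pab implementation: it assumes a simple magnitude score as a stand-in for a training-data-independent criterion such as synflow, and a greedy flip search on a toy target as a stand-in for gradient-based binary optimization such as bop; all names and numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen random initialization; the magnitudes are never trained.
w = rng.standard_normal((32, 16))

# Phase 1: choose a mask with a data-independent score.
# |w| is used here as a stand-in for scores such as synflow (assumption).
sparsity = 0.9
k = int((1 - sparsity) * w.size)              # number of weights to keep
keep = np.argsort(np.abs(w).ravel())[-k:]
mask = np.zeros(w.size, dtype=bool)
mask[keep] = True
mask = mask.reshape(w.shape)

# Phase 2: the only allowed transformation is a sign flip per kept weight.
# A toy regression target and a greedy flip search stand in for
# binary optimization on real training data (assumption).
target = rng.standard_normal(w.shape)
signs = np.ones_like(w)

def loss(s):
    return float(np.sum((mask * s * w - target) ** 2))

best = loss(signs)
for i, j in np.argwhere(mask):
    signs[i, j] = -signs[i, j]                # try flipping this sign
    trial = loss(signs)
    if trial < best:
        best = trial                          # keep the flip
    else:
        signs[i, j] = -signs[i, j]            # revert

disguised = mask * signs * w                  # sparse weights, magnitudes frozen, signs chosen
print(f"kept {int(mask.sum())} of {w.size} weights, toy loss {best:.2f}")
```

The point of the sketch is only the structure the reviews discuss: the random magnitudes stay untrained, the mask is fixed before any data is used, and the only learned degrees of freedom are one sign bit per kept weight.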
Below is given review of a research paper from conference journal. Please write a summary of the review.
### Review:
the paper addresses multiagent performative prediction multipfd focusing on a setting where multiple agents are learning a common decision vector from data that can be influenced by their decisions the paper formulates this as a decentralized optimization problem and identifies a necessary and sufficient condition for a unique multiagent performative stable multips solution the paper demonstrates that the consensus effect results in a relaxed requirement compared to the single agent case as a solution to the problem the paper studies a decentralized extension to a greedy deployment scheme dsgdgd demonstrates that it converges to the multips solution and analyzes the nonasymptotic convergence rate the analysis reveals that heterogeneous users and poor graph connectivity may impair convergence but do not affect the asymptotic rate some numerical experiments on synthetic and real data are provided as validation of the analysis

strengths
1 the paper studies an interesting problem and provides a thorough and useful theoretical analysis that addresses key questions of interest existence of stable solution convergence rate although parts of the proofs are similar to those of related results in decentralized optimization and performative prediction there are additional challenges in the analysis and these are handled in an elegant fashion
2 there is valuable discussion associated with the theoretical results going beyond expressions for bounds the paper discusses key contributing factors to various terms this leads to a specification of how factors such as graph connectivity impact the behaviour
3 the assumptions underpinning the analysis do not impose unreasonable constraints that make the setting impractical although the strongly convex loss assumption and the local decision impact do impose limitations
4 the paper is very well written the key theoretical results are presented in a clear manner and there is enough discussion of the proof procedure to provide the reader with an understanding of the general strategy
5 the numerical experiments while limited and not the main focus of the paper do provide a validation of the theoretical analysis

weaknesses
1 there is a relatively weak motivation for the setting under study in the paper in the introduction the paper cites hardt et al dong et al and kleinberg et al in providing an example scenario but the extension to the multiagent setting is not clearly motivated the example is a spam classifier and the proposed multiagent context is regional servers with users that respond in different ways in this case its not obvious why the consensus solution is desirable if there is sufficient data then it would seem that individual classifiers for each region would be preferable if not then i would think that it would be natural to explore compromises with some parameters being shared and others permitting a suitable response to the local data
2 experiments while the experiments are primarily illustrative and serve the role of validating the theory it is misleading and incorrect to claim that the paper illustrates the multipfd problem in a real application l 302 while the analyzed dataset is from a real application neither the division into multiple agents nor the users proposed adaptation are yes

docsepthis paper considers the problem of multiagent performative prediction multipfd and formulates multipfd as a decentralized optimization problem the
authors first prove the necessary and sufficient condition for the multipfd problem to admit a unique multiagent performative stable multips solution show that enforcing consensus leads to a laxer condition for existence of multips solution with respect to the distributions sensitivities compared to the single agent case then they study a decentralized extension to the greedy deployment scheme mendlerdunner et al 2020 called the dsgdgd scheme they show that dsgdgd converges to the multips solution and analyze its nonasymptotic convergence rate

pros
this paper provides the study and analysis of multipfd with consensus seeking agents via a practical dsgdgd scheme the results indicate that when agents are consensus seeking multipfd admits a performative stable solution with laxer condition and dsgdgd achieves linear speedup

cons
in terms of optimization the proposed methods are not novel especially with limitations such as the requirement of synchronous updates among agents strongly convex loss etc i would say the major contribution will be on the multipfd problem formulation side which i am not quite familiar with

the authors adequately addressed the limitations and potential negative societal impact of their work

docsepthis paper studies a multiagent performative prediction problem where the agents want to learn an identical local decision parameter to minimize the average of all local loss functions the main challenge is that the local data distribution which is an input to the local loss function will change as the agents update the local decision vectors before this work similar performative prediction problems have been studied in singleagent settings and competitive multiagent settings in contrast this work studies a cooperative multiagent setting where all agents are willing to share local decision vectors and want to reach a consensus the authors propose a decentralized learning scheme and provide a finitetime convergence bound they also conduct numerical simulations on synthetic data to validate the theoretical results

strength
the paper is wellwritten and easy to follow and the broad topic of performative prediction is interesting for the ml community the authors explained the intuition behind the problem setting the algorithm and the main theoretical results very well they also did a detailed comparison with related works to emphasize the novelties and challenges of this work

weakness
the multiagent performative prediction problem setting proposed in this work is a novel and natural extension of the single agent setting but the motivation is not sufficient in the following aspects since the local data distribution di only depends on the local decision vector thetai it seems unnatural to require all agents to reach a consensus on their local decision vectors besides the graph g of agents is only used as a communication graph compared with other works where the local data distribution di can depend on nonlocal decision vectors the multiagent problem considered here looks more like a localized problem more application examples are desired to motivate the setting considered in this work

yes the authors discussed the limitations of this work in the last paragraph and i dont see any negative social impact of this work

docsepthis paper formulates multipfd as a decentralized optimization problem and admits a multiagent solution compared to the singleagent case enforcing consensus leads to a laxer condition for the existence of multips solution with respect to the distributions sensitivity
this paper studies the dsgdgd scheme and analyzes its convergence towards the multips solution and shows that the scheme is convergent under the same sufficient condition for existence of multips solution

originality
compared to multiagent reinforcement learning algorithms the proposed dsgdgd method can converge to a unique multips solution

quality clarity
the article is well organized

significance
the algorithm proposed in this paper has practical significance in the dynamic classification task it can dynamically adapt to some changes in the environment such as spam classification tasks the algorithm was also experimented on this task and achieved good results on the spambasebased multiagent spam classification task no
### Summary:
this paper received a mixed set of reviews after reading the paper the reviews subsequent authorreviewer discussion and discussing with the reviewers i recommend that this paper be accepted the paper studies a multiagent version of the performative prediction problem the paper contains original and interesting contributions and we expect the community to appreciate this work in particular the theoretical contribution was appreciated by several reviewers although concerns were raised during the review process my impression is that these were satisfactorily addressed through the rebuttals

while preparing the camera ready we strongly encourage the authors to address the main concerns that came up in reviews these include
strengthening the motivation for the particular multiagent formulation studied in this paper
clarifying the relationship between performative prediction and prior work on distributed optimization
input_ids: [token id sequence for the example above omitted]
attention_mask: [all-ones mask of the same length omitted]
labels: [token id sequence (matching input_ids) omitted]
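For readers unfamiliar with the three array fields summarized above, the sketch below shows one plausible way such features are produced. It is an illustrative assumption rather than something stated by the dump itself: the tokenizer checkpoint, the maximum length, and the exact concatenation of the Input and Output fields are all hypothetical choices.

```python
# Minimal sketch (assumptions: a Hugging Face tokenizer and causal-LM style
# features in which labels simply mirror input_ids, as the arrays above appear to do).
from transformers import AutoTokenizer

# Hypothetical checkpoint; the dump does not say which tokenizer produced its ids.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

def build_features(input_text: str, output_text: str, max_length: int = 2048) -> dict:
    # Concatenate the prompt/review ("Input") with the target summary ("Output").
    text = input_text + " " + output_text
    enc = tokenizer(text, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all ones when nothing is padded
        "labels": list(enc["input_ids"]),         # causal-LM targets copy the inputs
    }
```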
Below is given review of a research paper from conference journal. Please write a summary of the review.
### Review:
this paper investigates the properties of decision transformer models trained on multiple atari games using offline rl the model is found to be superior to multigame models using other approaches for example online nontransformer models or transformer models trained with behavioral cloning the performance gap between singlegame trained models vs multigame trained models observed in some earlier works was found in this study as well

strengths
the experiments here are interesting and insightful likely to be of general interest to the ml community the work also seems technically sound

weaknesses
please see the limitations section below

i understand that these are ultimately empirical results but the authors could perhaps do a better job explaining why these observed results might make sense for example why online c51 doesnt seem to do as well as the offline dt or why nontransformer models in general dont seem to do as well as transformer models the lack of any scaling with model size for cql in fig 5 in fact inverse scaling with model size in fig 5b also seems surprising a priori for example what is the hypothesized explanation for this

in figure 5a the decision transformer is claimed to exhibit power law scaling with model size but this doesnt seem correct first of all its not really possible to conclude a power law or anything like that from three data points only with any degree of confidence secondly the figure shows a roughly linear increase in performance for the decision transformer in a semilogx plot this corresponds to a logarithmic improvement in performance with model size in fact it seems like the scaling on a linearlinear plot will be worse than logarithmic because the trend seems slightly slower than linear in the semilogx plot in fig 5a a power law would be linear in a loglog plot

also relatedly were separate hyperparameter searches performed for different model sizes in the scaling experiments in figure 5 and was the amount of compute flops controlled between different model sizes need to train smaller models for a larger number of iterations if not why not if not wouldnt that possibly underestimate the performance of the smaller size models in these experiments as demonstrated for example in the new scaling laws paper by hoffmann et al 2022

in section 46 the best model rollouts are compared against the best training data but im really not sure if this is a fair comparison with enough samples one is essentially guaranteed to surpass the top training data eventually shouldnt the model be expected to do better on average than the expert rollouts in the training data ideally the performance gap between singlegame and multigame models shown in fig 1 lack of positive transfer between games on average also observed in other works suggests this is not really the case yet with multigame models

docsepthis paper introduces the possibility of having a generalist rl agent that is based on the transformer the paper demonstrates the effectiveness of a single rl agent and that its scaling trend follows those of language and vision the idea is validated on atari games

strengths
the paper is easy to follow and the problem is well motivated what really stands out is the detailed description and the solid experiments performed i think the results could have great impact on rl and potentially making large transformer based offline rl training the new dominant paradigm for the field
especially with the scaling behavior

weakness
i guess the main weakness is the novelty since going from decision transformer to multitask decision transformer seems to be very straightforward nevertheless i think this does not diminish the main value here

no

docsepthe paper empirically evaluates the ability of the recently proposed decision transformer model to generalize across atari games it also investigates other empirical questions scaling behavior transfer learning baselines etc the main empirical result is that the decision transformer model outperforms other generalist agents but loses to singlegame agents the paper demonstrates how an improved policy can be extracted using guided generation approaches similar to those used in language modeling and shows that scaling trends similar to those seen in vision and nlp hold for atari games

update i thank the authors for their detailed responses after reading the other reviews and the responses ive increased my score to accept

strengths
the paper investigates an important question scaling behaviors of decision transformers and conducts a detailed empirical investigation of the decision transformer in atari games the experiments are well constructed and nicely detailed
the experimental results are novel to my knowledge the multigame decision transformer seems to be the stateoftheart in general game playing although restricted to atari games
the paper is well written its easy to follow the main ideas and experiments the paper also does a good job of placing its ideas into the larger body of work in sequence modeling upsidedown rl transfer learning etc
there appears to be some algorithmic novelty in the guided action generation

weaknesses
as the paper mentions its unclear if these results hold outside of atari games however evaluating this seems outside the scope of this paper
i didnt fully understand the testtime action selection described in section 34 for example what happens if p(expert|r) is very small full algorithmic pseudocode might make it clearer
it sounds like a single training run was used for each model is this correct the answers in the checklist also mention error bars but i didnt see any in the figures also the appendix mentions a single seed was used overall its unclear to me if these experiments are reproducible across seeds hyperparams etc that said given the large dataset and large computational cost this may not be a major issue but perhaps this needs to be addressed somewhere in the paper
given that the main contributions of the paper are empirical the above points make the overall impact of this paper somewhat limited id be inclined to revise my opinion based on the authors response to the questions below

the authors have described the main limitations based on the response to a question above it might need to be expanded to include sensitivity wrt seeds hyperparams etc

docsepthis paper proposes training a single model on a lot of games combining the rl approach with the same token prediction approach weve seen enable the recent very capable llm results the model they train does very well on the suite of atari games and lets us ask questions like
1 what is learned from playing all of those games
2 do the trends in vision and language hold for generalist rl agents
3 does this nextstep tokenization prediction approach work for these games
4 how do different online and offline methods work in this multigame setup
5 does this idea of expert action inference which comes from decomposing the nextstep tokenization to explicitly
model the expert prediction improve upon bc

the model is trained on trajectories collected in prior work and includes both expert and nonexpert behavior there are many baseline methods that they compare against all in all this is an extensive work which demonstrates that while the model does not beat singlegame performance as expected it is competitive across the majority of games

originality
this paper is not particularly original the only part of the algorithm that could be original is the expert action inference inspired by krause et al

clarity
its very clear theres no issues here nit l249 we discuss these results in the appendix

quality
the quality is high as well this paper sets out to do a goal figure out how to train a single agent to a high level on a lot of games and does it it then supports that finding with relevant questions about whether it is capable of learning more transfer how it compares with onlineoffline methods how it scales with size and how optimal vs suboptimal actions influence the learning these all make sense as experiments to do and bolster the argument

significance
the paper is significant in so far as it sets a bar for what comes next its somewhat of a successor although not multimodal to one model to rule them all which previously was significant in showing that we can do a lot more in a single model than previously thought its not significant in being worldshaking i think of this as being important in so far as a group showing the rest of the world that its possible

the authors are honest about their limitations in the blurb at the end
### Summary:
this paper demonstrates the generalization abilities of decision transformers relative to other approaches in multitask settings the reviewers found the topic interesting and the results compelling unanimously supporting its inclusion in the conference programme accept
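The scaling-law objection raised in the first review of the block above (a trend that looks linear on a semilog-x plot is only logarithmic, not a power law) can be summarized with the following standard identities. This is an editorial illustration, not part of the review or the reviewed paper:

```latex
\[
y = a\,x^{b} \;\iff\; \log y = \log a + b \log x
\quad \text{(power law: linear on a log-log plot)}
\]
\[
y = c + d \log x
\quad \text{(linear on a semilog-$x$ plot: logarithmic growth only)}
\]
```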
input_ids: [token id sequence for the example above omitted]
attention_mask: [all-ones mask of the same length omitted]
labels: [token id sequence (matching input_ids) omitted]
285, 253, 4891, 4679, 2684, 891, 1158, 253, 1543, 812, 452, 1270, 3486, 327, 391, 77, 285, 7826, 2403, 1781, 39707, 1754, 28841, 391, 77, 3733, 253, 747, 11360, 22199, 323, 253, 1673, 3340, 342, 253, 13642, 3879, 50276, 20881, 1255, 891, 5476, 253, 2022, 14855, 310, 253, 38135, 1580, 1469, 432, 3061, 39707, 281, 1554, 262, 1945, 3061, 39707, 3133, 281, 320, 1077, 15246, 17837, 891, 1158, 436, 1057, 417, 37856, 253, 2022, 1318, 1060, 642, 5474, 339, 431, 248, 2929, 45190, 44995, 253, 3745, 273, 253, 4102, 4081, 3061, 39707, 1566, 281, 39970, 2439, 387, 1792, 3958, 352, 671, 2340, 684, 643, 16774, 3533, 13642, 3879, 3700, 4715, 1666, 25379, 3966, 253, 2022, 16774, 906, 310, 326, 253, 3061, 39707, 1566, 41731, 13015, 643, 2087, 382, 6083, 533, 25068, 281, 1625, 1851, 482, 6083, 253, 2929, 14371, 849, 271, 5520, 3646, 476, 320, 10375, 970, 18107, 5978, 7274, 2074, 281, 1110, 908, 275, 3448, 14053, 285, 2722, 326, 13642, 13554, 2074, 281, 1110, 2326, 275, 8113, 285, 295, 24343, 2186, 323, 387, 1792, 3958, 50274, 11183, 891, 5717, 253, 4477, 323, 616, 7000, 6128, 846, 4361, 253, 643, 10123, 285, 253, 6128, 209, 422, 2559, 619, 4868, 281, 2997, 20544, 50274, 783, 2929, 2340, 684, 271, 1774, 1953, 13642, 13576, 273, 3061, 4979, 398, 285, 2589, 84, 247, 7000, 16774, 5839, 273, 253, 3061, 39707, 275, 387, 1792, 3958, 253, 4679, 403, 973, 8818, 285, 23395, 7000, 50274, 783, 5661, 1543, 403, 4460, 281, 619, 3640, 253, 1554, 304, 482, 3061, 39707, 3133, 281, 320, 253, 1375, 23037, 14387, 275, 2087, 2165, 4882, 3738, 11096, 281, 387, 1792, 3958, 50274, 783, 2929, 310, 973, 3542, 697, 3477, 281, 956, 253, 2022, 5697, 285, 4679, 253, 2929, 671, 1057, 247, 1175, 2628, 273, 15606, 697, 5697, 715, 253, 4067, 2133, 273, 789, 275, 3425, 14053, 598, 21773, 628, 391, 77, 3700, 4715, 3966, 50274, 9088, 4620, 281, 320, 690, 5933, 280, 38135, 275, 253, 18107, 2250, 5978, 50276, 20881, 1255, 265, 50274, 284, 253, 2929, 25957, 697, 12744, 604, 841, 1543, 2186, 3345, 273, 387, 1792, 3958, 2299, 16344, 436, 3133, 3345, 253, 7990, 273, 436, 2929, 50274, 74, 42126, 4751, 2096, 253, 1071, 2606, 2250, 27423, 2529, 275, 2593, 5910, 323, 1650, 752, 6569, 604, 759, 89, 468, 1206, 310, 1077, 1355, 2120, 5933, 280, 10585, 406, 853, 1537, 1056, 352, 30909, 50274, 262, 7835, 751, 247, 2014, 3733, 1408, 369, 908, 323, 1016, 1566, 310, 436, 3451, 253, 9172, 275, 253, 44282, 671, 3748, 2228, 8965, 533, 891, 42126, 923, 667, 275, 253, 8442, 671, 253, 30762, 25957, 247, 2014, 8357, 369, 908, 4583, 697, 12744, 281, 479, 604, 841, 4679, 403, 41374, 2439, 12922, 4373, 12928, 3966, 326, 753, 1677, 253, 1781, 10895, 285, 1781, 15180, 2105, 436, 778, 417, 320, 247, 2201, 2523, 533, 4931, 436, 3198, 281, 320, 9713, 9366, 275, 253, 2929, 50274, 28821, 326, 253, 2022, 9021, 273, 253, 2929, 403, 16774, 253, 1840, 2792, 1056, 253, 4583, 3486, 273, 436, 2929, 8489, 3710, 2654, 320, 21802, 281, 49620, 619, 4743, 1754, 327, 253, 4477, 2380, 281, 253, 3533, 2708, 253, 4477, 452, 2529, 253, 2022, 7364, 1754, 327, 253, 2380, 281, 247, 1953, 1840, 352, 1537, 878, 281, 320, 11848, 281, 2486, 7340, 8772, 12922, 4373, 12928, 3966, 5474, 33032, 2520, 2929, 29328, 3733, 247, 2014, 1566, 327, 247, 2257, 273, 3958, 16248, 253, 391, 77, 2746, 342, 253, 1072, 10669, 10554, 2746, 359, 306, 2326, 8046, 253, 3332, 1077, 7032, 298, 20347, 1543, 253, 1566, 597, 6194, 1057, 1077, 973, 327, 253, 18880, 273, 387, 1792, 3958, 285, 14935, 441, 1642, 3533, 751, 50276, 18, 752, 310, 6311, 432, 4882, 512, 273, 1110, 3958, 50276, 19, 513, 253, 13554, 275, 8113, 285, 3448, 
2186, 323, 2087, 382, 391, 77, 6083, 495, 1057, 436, 1735, 10539, 10669, 1320, 10554, 2746, 789, 323, 841, 3958, 577, 849, 513, 1027, 3909, 285, 28841, 3082, 789, 275, 436, 1554, 304, 482, 1162, 484, 608, 1057, 436, 2934, 273, 6485, 2250, 17032, 534, 3249, 432, 11101, 28163, 253, 1735, 10539, 10669, 1320, 281, 11120, 1566, 253, 6485, 10554, 3157, 2220, 49501, 50276, 783, 1566, 310, 10166, 327, 24102, 5728, 275, 2720, 789, 285, 3797, 1097, 6485, 285, 44382, 8292, 3879, 627, 403, 1142, 8245, 3082, 326, 597, 7277, 1411, 512, 275, 512, 436, 310, 271, 9470, 789, 534, 14371, 326, 1223, 253, 1566, 1057, 417, 7171, 1625, 1851, 482, 3045, 347, 3264, 352, 310, 12085, 2439, 253, 5020, 273, 3958, 3236, 414, 50276, 2520, 2929, 310, 417, 3782, 3236, 253, 760, 629, 273, 253, 5933, 326, 812, 320, 3236, 310, 253, 6485, 2250, 17032, 11797, 407, 465, 376, 2327, 1162, 355, 50276, 498, 15752, 50276, 953, 1077, 2590, 253, 373, 642, 3374, 1060, 50276, 32202, 50276, 77, 21361, 359, 2319, 841, 1543, 275, 253, 30762, 50276, 15177, 50276, 783, 3290, 310, 1029, 347, 973, 436, 2929, 5239, 562, 281, 513, 247, 4736, 50276, 13206, 562, 849, 281, 6194, 247, 2014, 5570, 281, 247, 1029, 1268, 327, 247, 2257, 273, 3958, 50276, 395, 1057, 352, 352, 840, 8525, 326, 4560, 342, 4623, 3533, 670, 1880, 352, 310, 7032, 273, 4715, 625, 3700, 849, 352, 26662, 342, 3909, 2727, 1282, 3082, 849, 352, 11498, 342, 1979, 285, 285, 849, 8654, 4632, 749, 29776, 5231, 4833, 253, 4715, 841, 512, 1056, 3282, 347, 4679, 281, 513, 285, 48404, 253, 4154, 50276, 9188, 40348, 50276, 783, 2929, 310, 1534, 275, 594, 2080, 347, 352, 5239, 247, 2534, 323, 752, 3249, 1735, 697, 8489, 273, 247, 24193, 3738, 417, 23390, 26306, 281, 581, 1566, 281, 4086, 731, 512, 534, 3786, 369, 1534, 275, 4645, 326, 359, 476, 513, 247, 2257, 625, 275, 247, 2014, 1566, 685, 3786, 1869, 697, 417, 1534, 275, 1146, 1533, 1200, 1170, 891, 1158, 273, 436, 347, 1146, 1774, 275, 594, 2080, 347, 247, 1387, 4645, 253, 1551, 273, 253, 1533, 326, 697, 1896, 50276, 783, 4477, 403, 8274, 670, 616, 7364, 275, 253, 787, 4063, 387, 253, 990, 2490, 187, 4118, 18435, 27, 2520, 2929, 14371, 253, 26647, 15277, 273, 3061, 4979, 398, 4103, 281, 643, 7274, 275, 1554, 262, 1945, 7533, 253, 30628, 1119, 253, 9400, 4722, 285, 253, 1543, 18511, 38350, 8109, 697, 11250, 275, 253, 8059, 13521, 2997 ]
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: this paper proposes a framework for recording synchronized multimodal data from wearable and global sensors in the real world to enable a learning pipeline to better understand human everyday behavior and improve inhome robotic assistants in addition to making the data publicly available opensource code and instructions for reproducing the presented framework are available the paper is excellent in that it simultaneously measures wearable sensors in addition to the environmentmounted sensing that has been done in action recognition to date in order to achieve a better understanding of human daily activities and a learning pipeline to improve inhome robotic assistants the fact that it targets kitchen activities that include twohanded operation and highlevel task planning as tasks is also considered a valid future application the name of the dataset is not appropriate the name of the dataset is limited to kitchen activities but it is misleading to think that it covers actionnet and general activities the superiority of this dataset should be indicated precisely in the comparison table with the dataset described in the related work section the number of sensors the number of recorded actions and the number of samples do not clearly indicate the superiority of this dataset as a publisher of the dataset shouldnt you benchmark the dataset with a basic model and indicate the baseline in the dataset when the dataset is cited in other papers it will be difficult to verify the performance and the validity of the dataset will be reduced

docsepthe authors present a framework to record multimodal data from multiple wearable and visual sensors using this framework they collected a human activity dataset the data samples contain a sequence of activities with multimodal sensor data the authors developed a framework to record multimodal sensor data synchronously the author collected human activity from multiple diverse wearable and sensor data as the authors claim that one of the contributions of this work is to develop a synchronized multimodal data recording system i would expect to see how it is different from the existing multimodal data collection system such as psiplatform for situated intelligence httpsgithubcommicrosoftpsi what is the novelty of this data collection framework compared to the existing data collection frameworks additionally i missed the details of their proposed synchronized data collection framework more specifically i would like to know more about the following things how the framework collects data from multiple sensors synchronously for example how can data be collected synchronously from rgbd pupil core headset and wearable sensors how are these sensors synchronized how the synchronization works is it offline ie data are collected from multiple sensors independently and then in the postprocessing step after the completion of data collection the data are synchronized or online ie data are collected synchronously and there is no need for postprocessing to synchronize the data streams after the data collection please explain concretely how are the online annotations performed i would like to suggest the author prepare a short video demo of how the data labeling system works during the online annotations are the participants continuously performing the sequence of actions without any interruption from the experimenter or an experimenter guides the participants on when to start and end one action and start another author claims this framework is extensible to additional sensors or environments line 61 i would like to know how the researchers can extend this framework to include additional sensors for example how the framework ensures the synchronized data collection for the additional sensor how easy will the extension be do the researchers need to change almost all the components of this framework to include additional sensors the author provides the list of libraries to recreate the environment to execute the data collection framework however i would suggest that the author share a docker to make it easy and accessible for other researchers to recreate the environment easily several multimodal activity datasets are available in the literature how does this new dataset contribute differently to the research community i suggest the author prepare a table to compare this new dataset to the existing and recent multimodal activity datasets such as utdmhad mmact nturgbd etc

docsepthe paper presents a multimodal dataset and recording framework with an emphasis on wearable sensing in a kitchen environment it uses a wearable sensing suite to capture motion force and attention information coupled with different action sequences in the kitchen the paper is the first to provide a multimodal dataset and recording framework with an emphasis on wearable sensing in a kitchen environment it uses motion tactile sensing body and eye tracking audio and image to form a dataset of kitchen activities 31 the paper presents a human activity dataset with the extensible framework used to capture it however the selected activity in the paper is very limited 6 activities the author claims it is a growing dataset but having 6 activities is not enough for a large dataset to produce robust results as claimed in the introduction there is a lack of user application discussion in the paper in particular what use cases can benefit from this paper many kitchen activities require more information than a camera and motion for example temperature shape the texture of the different objects etc the author needs to provide an application that justifies the use case of what is captured in the dataset

docsepthis paper presents a new multimodal dataset actionnet which records the activities of humans in a mock kitchen environment with multiple sensors the dataset contains multiple tasks including object manipulations dexterous actions and complex action sequences that are commonly seen in kitchenrelated activities and all tasks are labeled by authors the paper utilizes a large number of sensors to record the activities from the aspects of vision sound motion activities and tactile feelings using wearable and external sensors 1 compared to other existing activity datasets actionnet contains way more modalities so that it is able to capture more detailed and comprehensive information about human interaction with the environment and have a better description of human activities 2 the dataset itself is wellorganized and easily accessible the authors provide detailed information and guidance about the information of the dataset and the user guide which benefits the potential users 3 the dataset may be valuable for the potential development of smart textiles and the emulation and assistance of robots for a persons action 1 the size and diversity of the actionnet are limited compared to existing activity datasets the actionnet currently contains 10 subjects where each subject performs no more than 20 activities with about 785 minutes of videos on average however the existing activity dataset eg activitynet contains 648 hours of videos and 200 classes which is far more than the actionnet the authors claim actionnet circumvents the issue of workspace restriction line 38 while all videos are taken in the same mock kitchen the paper will be much stronger if it contains more diverse scenarios and activities 2 lack of experiments and applications the paper does not contain any applications or experiments that can benefit from this dataset which undermines the utility of this paper also since the dataset contains a large amount of multimodal information it is better to show ablation studies of how each modality is useful compared to only vision in existing activity datasets it may raise my concern that some modalities may not be that useful in understanding human activities

docsepin this paper the authors collected a dataset with 6 subjects 25 as planned by the time it is published on performing various kitchen cooking tasks a number of sensor streams are captured including motion eye tracking tactile sensing emg body and finger tracking audio video together with activity labels and subjective surveys subjects did a series of activities including peeling and slicing spreading openingclosing a jar wiping pouring and highlevel tableware tasks the authors envision that the potential usage of the dataset includes task planning robots training and smart textiles guiding i appreciate the authors creating such a rich dataset with excellent documentation and decent web pages a wide range of sensors and activities to provide a very rich dataset the website is very well documented with clear instructions on different sensorsactivities with corresponding data streamings the topic can be potentially interesting to a wide range of audience it is unclear how much impact is brought by different sensors especially gloves on participants natural behavior of cutting peeling etc however in order to collect the dataset such an effect seems to be inevitable i would recommend the authors mention it in the limitation why doesnt the activity task include some representative cooking tasks eg frying stirring grilling and baking to name just a few please justify the number of participants is never mentioned in the main text from the website the current dataset has 6 subjects which is limited however with the pipeline established it is not difficult to collect data from more participants as the website mentioned they aim for 25 subjects by 2022 fall the paper may benefit from some initial discussion of potential usage of the dataset

### Summary:
this paper presents an interesting multimodal dataset for research work on analyzing human actions while there are some issues with this work the authors have addressed most of them well during the rebuttal phase overall there is sufficient support to accept this paper at this conference the authors should include all these points in the revised paper and dataset before the conference date
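Much of the discussion in this record turns on how the recording framework keeps heterogeneous streams (rgbd camera, pupil core headset, wearables) synchronized, and whether that happens online or in postprocessing. As a rough illustration of the offline option the reviewer asks about, the sketch below aligns one stream onto another's clock by nearest-timestamp matching; the function name, the sampling rates, and the assumption that all devices already share one time base are hypothetical and are not taken from the reviewed framework.

# Hypothetical post-hoc alignment of two sensor streams onto a common reference clock.
import numpy as np

def align_streams(t_ref, t_other, x_other):
    # Resample x_other (sampled at timestamps t_other) onto the reference timestamps t_ref
    # by picking, for each reference time, the sample with the nearest timestamp.
    idx = np.searchsorted(t_other, t_ref)
    idx = np.clip(idx, 1, len(t_other) - 1)
    left = t_other[idx - 1]
    right = t_other[idx]
    idx = idx - ((t_ref - left) < (right - t_ref))  # step back when the left neighbour is closer
    return x_other[idx]

t_cam = np.arange(0.0, 10.0, 1 / 30)   # 30 hz camera timestamps used as the reference clock
t_imu = np.arange(0.0, 10.0, 1 / 200)  # 200 hz wearable imu timestamps
imu = np.random.randn(len(t_imu), 6)   # fake 6-axis imu samples
imu_on_cam_clock = align_streams(t_cam, t_imu, imu)  # one imu row per camera frame

An online variant would instead stamp every sample with a shared clock at capture time; that distinction is essentially what the reviewer is asking the authors to make explicit.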
[ 30003, 310, 1677, 2278, ... , 209 ]  (input_ids per the column schema: token-id encoding of the record above; full listing omitted)
[ 1, 1, 1, ... , 1 ]  (attention_mask: all ones, one entry per token; full listing omitted)
[ 30003, 310, 1677, 2278, ... , 209 ]  (labels: a copy of the input_ids array; full listing omitted)
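The three numeric columns attached to each record (input_ids, attention_mask, labels in the schema) look like standard preprocessing for causal language-model finetuning on this summarization task: the mask is all ones because nothing is padded, and labels simply mirror input_ids. Below is a minimal sketch of how such columns are typically produced; the tokenizer choice, the separator, and the maximum length are assumptions, not read from this dataset.

# Illustrative preprocessing that would yield columns shaped like the ones shown above.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")  # assumed tokenizer; the real one is not stated

def build_row(input_text, output_text, max_len=2048):
    text = input_text + " " + output_text      # prompt followed by the target summary
    enc = tok(text, truncation=True, max_length=max_len)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all ones when no padding is applied
        "labels": list(enc["input_ids"]),         # causal LM: labels mirror input_ids
    }

Note that with labels copied verbatim from input_ids, the training loss is also computed over the prompt tokens unless the prompt positions are later masked out (for example by setting them to -100).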
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: the architecture of the tracker is standard siamese the novelty is at a technical level modules of the crossguided type have been proposed it does bring an improvement but not to the stateoftheart level there is no significant insight training updating novelty or theoretical contribution recent shortterm trackers output segmentation the proposed tracker outputs a bounding box the performance of the tracker is evaluated on uav vot 2018 and vot 2019 uav is saturated the performance on vot 2019 is worse than stateoftheart which is not reported the trackers selected for comparison do not include the best performing ones it is not clear why vot 2020 was not included overall this is yet another siamese tracker which is not sufficient for iclr acceptance

docsepthis paper presents a siamesebased single object tracking method using attention mechanisms both channelwise and spatialwise for learning deep correlation between exemplar and candidate images extensive experiments on uav123 vot2018 and vot2019 demonstrate the effectiveness of the proposed method and the 35 fps running speed indicates its competitive efficiency

strengths 1 the idea of multiplying the weights of the aggregated template feature to the search feature is interesting and rather intuitive the experimental results on several datasets confirmed the effectiveness 2 this paper is wellorganized and easy to follow 3 comprehensive experiments including the appendix are carried out and convincing

weaknesses 1 lacking important literature reviews on highly relevant works the proposed sot method belongs to 1 anchorfree and 2 attentionbased pipelines there are several latest and popular works the author should mention in related works siamfc towards robust and accurate visual tracking with target estimation guidelines httpsarxivorgpdf191106188pdf siamcar siamese fully convolutional classification and regression for visual tracking httpsarxivorgabs191107241v2 satin siamese attentional keypoint network for high performance visual tracking httpsarxivorgpdf190410128pdf all three works above are anchorfree methods and the third paper satin adopts an attentional mechanism for sot the author compared the proposed method with siamfc and siamcar in experimental results but the review of these methods and explanation of the difference in methodology are missing in this paper as a matter of fact the most relevant work to this paper is satin which also adopted channel and spatial attention in a siamese architecture the only difference is multiplying the attentive channel weights of the exemplar to the candidate which is called crossattention in this paper the author should give a more thorough review and explain the difference between these works 2 the contribution of this work is rather limited the paper describes three contributions in section 1 however the second point anchorfree and diou is hardly the contribution of this work but a reimplementation of previous works and the third point is only a report of experimental results in fact the only contribution of this work is introducing crosschannel attention to explicitly learn the correlation of template and search images which is a small trick in network design compared to satin which also adopts channel attention to the template feature and spatial attention later this paper does not bring a substantial contribution to the visual tracking community 3 the comparison with sota methods is not comprehensive the experimental results on uav123 do not include the sota method siamfc similarly results on vot2018 do not include siamcar and results on vot2019 do not include siamfc and siamcar moreover the author should also compare the results with satin on both otb and vot datasets 4 the presentations of figure 1 and figure 4 are terrible the resolution of figure 1 is rather low and the author did not even explain which one corresponds to the challenge situation of three aspects in the figure caption it is hard to tell which one is better in case 1 first row and case 2 second row due to the low resolution figure 4 is rather confusing the author did not explain the visualized confidence map belongs to which layer and whether this attentive map is from the template feature or the search feature i assume the second row on the left side is from the subsequent frames then whats the meaning of both rows on the right side and why there is a blank gap between left and right the author should pay more attention to improve the figures in this paper and the captions should be more thorough without ambiguity

docsep summary this paper proposes a siamese network for visual tracking namely siamcan which utilizes cross channel attention spatial attention and an anchorfree regression head the method achieves stateoftheart performance on four visual tracking benchmarks

strengths the design of the network is reasonable including using the templates channel information to help the search branch to learn a more specific feature using spatial attention to aggregate location information from the depthwise correlation and utilizing an anchorfree regression head to locate the object the final results are good and the ablation study shows the effectiveness of these modules

weaknesses the technical contribution is weak the main components including channel attention spatial attention and the anchorfree network are not new this proposed method is quite similar to siamban especially the multihead fusion and the anchorfree network furthermore the cross channel attention is more like a crosscorrelation between the search branch feature and the pooled template vector in this case are the 1d conv layer and the sigmoid function necessary the description of the method is not clear in figure 2 there is only one correlation map but in sec 33 the author said it has n correlation maps from different layers of the backbone furthermore the index i of these n layers is in conflict with the position index pointi j which would lead to misunderstanding also in figure 2 the classification maps size is 25x25x1 but it is 25x25x2 in sec 33 the description of spatial attention is not clear we can find maxavgpool in the figure but find no description so how is the spatial attention used the only word i can find is in related works cbam woo et al 2018 utilizes both maxpooling and averagepooling to generate the merged attention includes channel and spatial attention is that the same in sec 32 inspired by wang et al 2020b we add channel attention and spatial attention into our network however wang et al 2020b did not use a spatial attention module

overall rating due to the weakness described above i rate a marginally below acceptance threshold score for the paper

docsep 1 summary in this paper the authors introduce a crosschannel attention mechanism and an anchorfree box regression branch with diou loss to deal with the clutters during the tracking procedure

2 strengths the experimental results show that this method has excellent performance on several public benchmark datasets

3 weaknesses the novelty of the paper is deficient the proposed method such as the crossattention mechanism 1 anchorfree regression 2 3 have been previously exploited in existing models the proposed method is obviously based on siamban 2 however there exists much duplicated content which has already been mentioned in 2 in the experiments part some important algorithms to compare against are missing such as siamban which is the baseline method for this paper the analysis of the proposed method is deficient

1 y yu y xiong w huang m r scott deformable siamese attention networks for visual object tracking in proceedings of the ieeecvf conference on computer vision and pattern recognition 2020 pp 6728-6737
2 zedu chen bineng zhong guorong li zhang shengping and ji rongrong siamese box adaptive network for visual tracking in ieee conference on computer vision and pattern recognition pages 6668-6677 2020
3 yinda xu zeyu wang zuoxin li ye yuan and gang yu siamfc towards robust and accurate visual tracking with target estimation guidelines in aaai pages 12549-12556 2020

4 correctness the claims method and empirical methodology are correct

5 clarity the paper writing is clear and easy to follow but there exist some grammar faults moreover in sec 33 there are two identical subtitles and similar content which may be confusing besides in figure 4 the arrangement of the pictures does not seem to match the description text

6 relation to prior work it is an incremental work based on the prior work

7 reproducibility yes

### Summary:
all three reviewers initially recommended reject the main concerns were 1 weak technical contribution and insight r1 r2 r3 r4 2 incremental novelty another variation of siamfc r1 r2 r3 3 unconvincing experiment results against missing sota r1 r2 r3 the authors response did not assuage these concerns
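Several of these reviews describe the same mechanism: the template (exemplar) feature is pooled into a per-channel weight vector, passed through a 1d conv and a sigmoid, multiplied into the search feature, and the result then goes through depthwise cross-correlation before the classification and anchorfree regression heads. The sketch below is a generic PyTorch-style rendering of that idea so the reviewers' question about whether the 1d conv and sigmoid are necessary is concrete; layer sizes, names, and the overall wiring are illustrative and are not taken from the siamcan paper.

# Minimal sketch of the cross-channel attention discussed in the reviews (illustrative only).
import torch
import torch.nn.functional as F
from torch import nn

class CrossChannelAttention(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1d = nn.Conv1d(1, 1, kernel_size=3, padding=1)  # mixes neighbouring channels

    def forward(self, template_feat, search_feat):
        # template_feat: (B, C, Ht, Wt), search_feat: (B, C, Hs, Ws)
        w = template_feat.mean(dim=(2, 3))                # (B, C) global average pool of template
        w = self.conv1d(w.unsqueeze(1)).squeeze(1)        # (B, C) channel interaction
        w = torch.sigmoid(w).unsqueeze(-1).unsqueeze(-1)  # (B, C, 1, 1) channel weights
        return search_feat * w                            # reweighted search feature

def depthwise_xcorr(search_feat, template_feat):
    # per-channel correlation of the template over the search region, as in siamese trackers
    b, c, h, w = search_feat.shape
    kernel = template_feat.reshape(b * c, 1, *template_feat.shape[2:])
    out = F.conv2d(search_feat.reshape(1, b * c, h, w), kernel, groups=b * c)
    return out.reshape(b, c, out.shape[-2], out.shape[-1])

Dropping the conv1d and sigmoid here would reduce the module to weighting the search feature by the raw pooled template vector, which is exactly the plain cross-correlation reading suggested in the third review.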
input_ids: [30003, 310, 1677, 2278, ...] (token IDs for this example; full list omitted)
attention_mask: [1, 1, 1, ...] (all ones; full list omitted)
labels: [30003, 310, 1677, 2278, ...] (same leading token IDs as input_ids; full list omitted)
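The input_ids, attention_mask, and labels columns above are just a tokenized rendering of the text columns. As a rough sketch of how such columns are typically produced (the specific tokenizer, maximum length, and label scheme are assumptions here, not something this dump states), using the Hugging Face transformers API:

```python
from transformers import AutoTokenizer

# Assumed tokenizer; the dataset's actual tokenizer is not identified in this dump.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

def encode_example(input_text: str, output_text: str, max_length: int = 2048):
    """Tokenize one (Input, Output) pair into the three ID columns."""
    full_text = input_text + " " + output_text
    enc = tokenizer(full_text, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],            # token IDs of the concatenated text
        "attention_mask": enc["attention_mask"],  # 1 for every real token (no padding here)
        "labels": list(enc["input_ids"]),         # causal-LM targets; a copy of input_ids
    }
```

Under this assumption, attention_mask is all ones simply because a single un-batched example needs no padding.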
Below is a given review of a research paper from a conference journal. Please write a summary of the review.

### Review:

Summary of the contributions: this paper investigates the impact of two variables on language acquisition with human subjects. The two variables are the amount of structure, as measured by a metric akin to topographic similarity (Brighton and Kirby, 2006), in the artificial languages the participants have to consider, and the population size that made those artificial languages emerge, as found in a previous study. The paper reports on the learnability / memorisation / ease of learning, measured via a memorisation test on the training set, and the generalisation abilities, measured via a zero-shot compositional test (as is common in the language emergence field), of the participants, depending on the artificial language they were considering.

Decision: given the goal of this year's workshop to foster interdisciplinary discussions, I think that this paper is very relevant, as it tackles some concerns that have been investigated in the field of language emergence with populations of artificial agents, and it does so with populations of human participants.

References: H. Brighton and S. Kirby. Understanding linguistic evolution by visualizing the emergence of topographic mappings. Artificial Life, 12(2):229-242, Jan 2006. ISSN 1064-5462. doi: 10.1162/artl.2006.12.2.229. URL: http://www.mitpressjournals.org/doi/10.1162/artl.2006.12.2.229

docsep

While I am not an expert in cognitive science, I am familiar with the topic of language learnability and its connection to ease of learning. This paper mainly provides a confirmation of this statement with the emerged language from another study. I think it is a good contribution for the workshop, so that more people are aware of this statement on linguistic structure and ease of learning.

### Summary:
This previously published work gives a great cognitive science / language evolution perspective on ease of language learning, and we look forward to having discussions at the workshop.
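The first review above hinges on topographic similarity (Brighton and Kirby, 2006) as its structure metric. As a hedged illustration of what that metric measures (the distance functions and the Spearman correlation below are common choices, not necessarily the ones used in the reviewed paper), it is the correlation between pairwise distances in meaning space and pairwise distances in message space:

```python
# Hypothetical sketch of topographic similarity: correlation between
# pairwise meaning distances and pairwise message (string) distances.
from itertools import combinations
from scipy.stats import spearmanr

def hamming(a, b):
    # Distance between two meaning vectors of equal length.
    return sum(x != y for x, y in zip(a, b))

def edit_distance(s, t):
    # Standard dynamic-programming Levenshtein distance between two messages.
    dp = list(range(len(t) + 1))
    for i, cs in enumerate(s, 1):
        prev, dp[0] = dp[0], i
        for j, ct in enumerate(t, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (cs != ct))
    return dp[-1]

def topographic_similarity(meanings, messages):
    pairs = list(combinations(range(len(meanings)), 2))
    meaning_d = [hamming(meanings[i], meanings[j]) for i, j in pairs]
    message_d = [edit_distance(messages[i], messages[j]) for i, j in pairs]
    corr, _ = spearmanr(meaning_d, message_d)
    return corr
```

A language is "more structured" in this sense when nearby meanings map to nearby messages, which is exactly the property the review connects to ease of learning.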
input_ids: [30003, 310, 1677, 2278, ...] (token IDs for this example; full list omitted)
attention_mask: [1, 1, 1, ...] (all ones; full list omitted)
labels: [30003, 310, 1677, 2278, ...] (same leading token IDs as input_ids; full list omitted)
Below is a given review of a research paper from a conference journal. Please write a summary of the review.

### Review:

The authors propose a statistic from information theory as a tool to analyze several aspects in machine learning (data, model, etc.): use the codelength of a dataset, subtracting the cross-entropy of the final model. The results are interesting but slightly fall short of novelty and do not lead to any significant implication/improvement for the existing machine learning pipeline/model.

Clarity/originality/significance: below I will take Section 3 as a specific example to elaborate. In Section 3, the authors try to use L(M_D, D) to define task complexity and argue that noise and task complexity are two independent directions. It seems that what the authors try to define is simply the mutual information between x and y, where (x_i, y_i) is drawn from the joint distribution of (x, y). The quantity L(M_D, D) is an approximation (lower bound) of the mutual information; this is a commonly known fact, which makes the results fall short of novelty. When the authors define the approximation as the task complexity, it becomes dependent on both the model and the dataset, not the task alone. Therefore L(M_D, D) will be strongly dependent on how one trains the model and on the size of the dataset; for example, with/without dropout, L(M_D, D) will differ significantly. As a result, it is difficult to get a useful takeaway from the results of Section 3, such as Figure 3. In addition, it is unclear what the implications would be: if we have an approximation of the mutual information between x and y, could we use it to improve the learning of the task?

In the authors' response, I hope the authors can clarify the dependence of their metric on the model and dataset besides the task. Also please clarify how it relates to mutual information, whether there is any related work that studies the approximation of mutual information between x and y, and their relationship with the current work. Also, it would be interesting to know if there are any potential applications for the proposed descriptive statistic on task complexity.

docsep

The paper examines different use cases of a quantity proposed in prior works which is said to capture the model information. It shows that this quantity behaves as expected. Overall, quantifying the amount of information in a deep neural network is a very interesting question for the community, with both theoretical and practical appeal, from an analytical perspective but also from a model selection perspective, for example. Overall I am scoring the paper with a reject: while the model information seemingly behaves as expected, the weaknesses in the exposition, as well as potential mistakes and a lack of rigour in the paper, will require careful editing.

Quantifying model information can be an important tool. This paper sets out to show that quantities defined by Voita and Titov (2020) and Zhang et al. (2020), respectively, seem meaningful and behave as one would expect. However, the examples lack falsification and depth. The structure of the ResNet in 7a is either wrong or nonstandard: the residual connection is usually just an identity function, out = in + block(in). Is this intentional, and if it is, what does this tell us about regular ResNets then? Similarly for the model ablation, it is not clear why resetting the parameters to their initial parameters is the appropriate method of ablation. Another potential issue is that the definitions in Section 4 pretend that L_M(A, B) = L_M(B, A) when it is clearly not; see also Xu (2020, https://arxiv.org/abs/2002.10689) for example. This reviewer would wish for a more in-depth treatment of the various examples in order to be thoroughly convinced.

Minor concerns: Figure 1 uses a line plot even though the x-axis refers to unordered categories.

Rebuttal: I thank the authors for their detailed reply. I still consider the contribution to be too high-level and to cover too much ground without going into sufficient depth. I am not sure I can follow the argument about Kolmogorov complexity; its chain rule is also only equal up to a logarithmic factor. I will keep my score the same.

docsep

The central concept of this paper is model information, a description length of a discriminative model. The authors advocate usage of model information for analyzing several aspects in deep learning; in particular, they show how model information can be used to judge the difficulty of supervised tasks, domain similarity, model capacity, roles of different network components, and knowledge distillation. All of these are important topics and are relevant to the ICLR community. While most of the definitions and interpretations seem intuitive and valid, there are a few concerns and questions, and in some cases it is hard to decide whether the conclusions of the interpretations should be trusted.

Model information: the term "model information" is vague and may create associations with other quantities, like the Shannon mutual information the model has about the training dataset, I(theta; D). I suggest clarifying early that by model information it is meant some kind of description length of the discriminative model p_theta(y | x). As model information is not a well-defined concept, it is not appropriate to say "an approximation of model information"; better to say an instance / a variant / a definition of model information. As all experiments are done using the instance of model information defined by Zhang et al. [1], called information transfer, the conclusions may not hold for other instances of model information. For example, if one defines model information with the prequential coding of Blier and Ollivier [2], then increasing the number of examples with noisy labels will make the task more difficult. Please clarify whether the k examples in Equation 2 are from the training set D or not.

Task difficulty: I really like using model information for assessing task difficulty. The only concern is that the task difficulty depends on the network and the training algorithm. Additionally, I suggest plotting task difficulty versus the number of training examples: will we see that it plateaus after a large enough number of training examples? How would task difficulty behave in the small-data regime?

Domain similarity: what do union and intersection signs mean for datasets in Equation 4? If they mean the same as in Equation 3, you can just repeat the same notation. Please explain how the right-hand side of Equation 4 fits into the framework of Equation 3. When reading that s_uni(A, B) is asymmetric, one might infer that s(A, B) is symmetric, which is not true; please clarify that both measures are unidirectional. For example, if s_uni(A, B) = 1 and s_uni(B, A) = 1, then one can tell that A is a subset of B; please explain why this sentence is true. The definitions of domain similarity in Equations 4 and 5 seem a little bit arbitrary; why should they be defined like that? How do these domain similarity measures compare to other measures, such as the domain similarity computed by Task2Vec [3]? For easier interpretation of Fig. 5, you can use the same color map in all subplots.

Model capacity: as I understand, model capacity defined in Equation 6 depends on the task; why shouldn't one take the supremum over tasks too? In the experiments of Fig. 6, do all datasets have the same size? The straight lines in Fig. 6c don't approximate the curves well; to have more points on the horizontal axis you can consider resnet183550k networks and vary k.

Ablation: how do we know that the quantity defined in Equation 7 truly captures how important a component is, and that we should trust the conclusions of the experiments presented in Fig. 7?

Minor notes: to make the paper more self-contained, please mention in the main text which definition of model information is used in the experiments; the same applies for model confidence. In Eq. 7, the quantity M_D(bar c) should be defined beforehand.

References:
[1] Xiao Zhang, Xingjian Li, Dejing Dou, and Ji Wu. Measuring information transfer in neural networks. arXiv preprint arXiv:2009.07624, 2020.
[2] Leonard Blier and Yann Ollivier. The description length of deep learning models. NeurIPS, 2018.
[3] Alessandro Achille et al. Task2Vec: task embedding for meta-learning. Proceedings of the IEEE International Conference on Computer Vision, 2019.

docsep

This paper discusses the use of model information to assess properties of deep learning models and problems; in particular, it illustrates how model information may be used to capture the difficulty of a task, the degree of similarity between domains, and the capacity of a model. The idea of using a quantity related to codelength to assess properties of a learning model is interesting; indeed, quantifying the properties chosen by the authors (task complexity, domain similarity) would certainly be useful. However, I am dubious whether the assessment using model information would be reliable. To my understanding, model information evaluates the amount of information a trained model M_D contains about dataset D as a difference between the codelength of the data and the codelength of model/data, as explained in Section 2. Critically, it seems to me that this quantity depends both on the data and the model; it is therefore a particular measure for a specific model, or at most a family of models, and a dataset. Incidentally, although guessable, it would be proper to define all the symbols in the equations.

In Section 3, it is not clear to me how this measure may be used to evaluate, for instance, the difficulty of a task. How is M_D chosen? Is there an underlying claim that this measure would be independent of the choice of model? What is the sensitivity of the information measure to the choice of the model? I am also perplexed by the results of the simulations: it seems that dropping samples or scrambling labels makes the task easier. This, though, is quite counterintuitive, as it would suggest that to simplify the problem we could literally drop samples or scramble labels. I guess, but I may be wrong, that this outcome is due to the fact that difficulty is not evaluated in terms of generalization.

Similarly, in Section 4 it seems to me that a discussion on M_A and M_B is lacking. It is stated that a property of informational similarity is its independence from particular representations/modelling; this seems to me not to hold for model information, where we have to rely on M_A and M_B.

In Section 5, how is a given task t defined? This is particularly important, as it defines the set over which the maximization is computed and with respect to which capacity is defined. My intuition is that computing this quantity on a small set of tasks would return at best a lower bound.

I also have some doubts in Section 7, on Equation 8: it would seem to me that the model information of the student should be relative to a different dataset, one enriched with the output probabilities generated by the teacher. Why is it not?
### Summary:
This work presented a broad set of interesting applications of model information toward understanding task difficulty, domain similarity, and more. However, reviewers were concerned about the validity and rigor of the conclusions. Going into more depth in a subset of the areas presented would strengthen the paper, as would further discussions and experiments around the limitations of model information with regard to specific models and dataset sizes, as you have begun to discuss in Section 8. Additionally, reviewers found the updated paper with connections to Kolmogorov complexity interesting, but wanted a more formal treatment and analysis of the relationship.
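The reviews in this example all revolve around how the paper's "model information" quantity is computed: roughly, the codelength of the labels under a trivial code minus the cross-entropy codelength under the trained model. The sketch below is a generic reconstruction of that description-length view for intuition only; it is not the exact "information transfer" definition of Zhang et al. [1] discussed by the reviewers:

```python
import math

def model_information_bits(probs_true_label, num_classes):
    """Rough sketch: codelength of the labels under a uniform code minus the
    cross-entropy codelength under the trained model, in bits.

    probs_true_label: model probability assigned to the correct label of each
    example (assumed to come from the final trained model).
    num_classes: size of the label set.
    """
    n = len(probs_true_label)
    codelength_uniform = n * math.log2(num_classes)                    # L(D) with a trivial code
    codelength_model = -sum(math.log2(p) for p in probs_true_label)    # cross-entropy codelength
    return codelength_uniform - codelength_model
```

On this reading, the reviewers' main objection is visible directly in the code: the result depends on the probabilities produced by one particular trained model, so it measures a model-dataset pair rather than the task alone.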
input_ids: [498, 15752, 19164, 414, ...] (token IDs for this example; full list omitted)
attention_mask: [1, 1, 1, ...] (all ones; full list omitted)
[ 498, 15752, 19164, 414, 9188, 40348, 50275, 27490, 891, 588, 1379, 2593, 495, 347, 247, 2173, 1650, 281, 21184, 275, 2593, 495, 253, 4477, 1611, 281, 897, 298, 78, 1678, 281, 4853, 4836, 10454, 285, 8219, 326, 6046, 285, 4836, 10454, 403, 767, 3907, 10746, 50275, 262, 3133, 752, 253, 4477, 1611, 281, 4853, 310, 3365, 253, 15577, 1491, 875, 1269, 285, 340, 835, 1269, 14059, 74, 310, 8392, 432, 253, 6036, 3268, 273, 1269, 90, 253, 10671, 298, 78, 1678, 310, 271, 11193, 2406, 3033, 273, 253, 15577, 1491, 436, 310, 247, 7744, 1929, 958, 534, 2789, 253, 1543, 2965, 2159, 273, 38135, 50276, 9453, 253, 4477, 4853, 253, 11193, 347, 253, 4836, 10454, 352, 4916, 7976, 327, 1097, 1566, 285, 253, 10895, 417, 4836, 3815, 3103, 298, 78, 1678, 588, 320, 7052, 7976, 327, 849, 581, 18784, 253, 1566, 285, 253, 1979, 273, 10895, 323, 1650, 342, 14920, 5926, 483, 298, 78, 1678, 588, 9184, 3012, 50275, 284, 247, 906, 352, 310, 2834, 281, 755, 4217, 1379, 12594, 432, 253, 1543, 273, 2593, 495, 824, 347, 4677, 495, 50275, 249, 1635, 352, 310, 12744, 752, 253, 12739, 651, 320, 604, 359, 452, 271, 11193, 273, 15577, 1491, 875, 1269, 285, 340, 812, 359, 897, 352, 281, 3157, 253, 4715, 273, 253, 4836, 50276, 249, 4477, 2380, 891, 3524, 253, 4477, 476, 19148, 253, 10096, 273, 616, 7982, 327, 1566, 10895, 16280, 253, 4836, 671, 4496, 19148, 849, 352, 7033, 281, 15577, 1491, 1880, 627, 310, 667, 2905, 789, 326, 2175, 253, 11193, 273, 15577, 1491, 875, 1269, 285, 340, 285, 616, 2954, 342, 253, 1655, 789, 671, 352, 651, 320, 4722, 281, 871, 604, 627, 403, 667, 2442, 2898, 323, 253, 4081, 27389, 26312, 327, 4836, 10454, 5474, 339, 431, 248, 2929, 33888, 1027, 441, 886, 1169, 273, 247, 10671, 4081, 275, 2720, 2987, 534, 310, 753, 281, 9232, 253, 1566, 1491, 352, 2722, 326, 436, 10671, 37824, 347, 3264, 4583, 2677, 5411, 253, 2408, 273, 1491, 247, 3676, 11454, 2990, 310, 247, 1077, 4722, 1953, 323, 253, 3114, 342, 1097, 10527, 285, 8542, 4549, 432, 271, 16101, 8668, 533, 671, 432, 247, 1566, 5438, 8668, 323, 1650, 50276, 1189, 455, 516, 14755, 253, 2929, 342, 247, 12009, 1223, 253, 1566, 1491, 16907, 37824, 347, 3264, 253, 32213, 275, 253, 47284, 347, 973, 347, 2442, 16503, 285, 247, 3480, 273, 8132, 454, 275, 253, 2929, 588, 2430, 10182, 14835, 50276, 17149, 5411, 1566, 1491, 476, 320, 271, 1774, 4968, 436, 2929, 5239, 562, 281, 921, 326, 13483, 2931, 407, 3273, 5741, 50276, 39584, 729, 9169, 285, 1182, 12109, 1162, 355, 9169, 2975, 1646, 14282, 285, 21319, 347, 581, 651, 1902, 50276, 35529, 253, 6667, 3480, 21649, 1877, 285, 6864, 50275, 783, 2605, 273, 253, 501, 3024, 275, 818, 66, 310, 2057, 3430, 390, 1327, 15291, 253, 12541, 4602, 310, 3798, 816, 271, 6489, 1159, 562, 50276, 249, 50276, 6172, 249, 310, 436, 22991, 285, 604, 352, 310, 752, 1057, 436, 2028, 441, 670, 3963, 501, 47301, 840, 50276, 3549, 6241, 323, 253, 1566, 28913, 352, 310, 417, 2590, 2139, 14932, 1076, 253, 3602, 281, 616, 3302, 3602, 310, 253, 4569, 1332, 273, 28913, 50276, 23955, 2442, 2523, 310, 326, 253, 14308, 275, 2593, 577, 22830, 326, 298, 785, 270, 50276, 77, 1814, 247, 672, 352, 310, 4518, 417, 923, 671, 1269, 86, 9169, 3614, 39962, 2061, 5375, 10016, 740, 29941, 323, 1650, 50276, 2520, 37317, 651, 5730, 323, 247, 625, 801, 554, 394, 1971, 273, 253, 2710, 6667, 275, 1340, 281, 320, 16575, 13762, 50275, 37585, 7350, 50276, 13206, 337, 4648, 247, 1386, 7484, 1014, 2167, 253, 1269, 10565, 10770, 281, 440, 16586, 9050, 50274, 250, 2858, 22559, 50276, 74, 5717, 253, 4477, 323, 616, 7000, 12252, 891, 1335, 1908, 253, 7680, 281, 320, 1512, 1029, 
5251, 285, 281, 3835, 1512, 1199, 3216, 1293, 1469, 715, 4209, 6864, 891, 717, 417, 2119, 891, 476, 956, 253, 4154, 670, 38301, 44519, 42017, 10454, 697, 5931, 4086, 310, 671, 760, 4503, 598, 281, 247, 32643, 2803, 891, 588, 1978, 619, 4868, 253, 1775, 264, 406, 339, 431, 248, 4275, 4473, 273, 436, 2929, 310, 1566, 1491, 247, 5740, 2978, 273, 247, 20741, 800, 1566, 253, 4477, 21424, 10393, 273, 1566, 1491, 323, 18918, 2067, 7794, 275, 3676, 4715, 275, 1798, 597, 921, 849, 1566, 1491, 476, 320, 908, 281, 5963, 670, 10183, 273, 22296, 8892, 5028, 14259, 1566, 5350, 9503, 273, 1027, 2990, 4295, 285, 3640, 940, 21755, 512, 273, 841, 403, 1774, 12989, 285, 403, 4623, 281, 253, 17857, 32888, 3114, 1223, 954, 273, 253, 14308, 285, 27838, 1646, 27350, 285, 3588, 627, 403, 247, 1643, 7350, 285, 3533, 285, 275, 690, 2219, 352, 310, 1892, 281, 7617, 1880, 253, 11815, 273, 253, 27838, 943, 320, 18273, 50276, 7645, 1491, 50276, 783, 1307, 1566, 1491, 310, 21248, 285, 778, 2794, 12485, 342, 643, 13483, 751, 253, 439, 16554, 15577, 1491, 1566, 556, 670, 253, 3733, 10895, 352, 22666, 50276, 69, 891, 1804, 281, 19148, 2393, 326, 407, 1566, 1491, 352, 310, 5486, 690, 2238, 273, 5740, 2978, 273, 253, 20741, 800, 1566, 268, 783, 85, 333, 4260, 1269, 50276, 284, 1566, 1491, 310, 417, 247, 6210, 392, 37224, 4473, 352, 310, 417, 4569, 281, 1333, 271, 11193, 273, 1566, 1491, 1805, 281, 1333, 271, 4227, 66, 1459, 7300, 5426, 273, 1566, 1491, 50276, 284, 512, 4679, 403, 2218, 970, 253, 4227, 273, 1566, 1491, 2931, 407, 1182, 12109, 1162, 355, 337, 1925, 1491, 3700, 253, 11815, 778, 417, 2186, 323, 643, 10872, 273, 1566, 1491, 323, 1650, 604, 581, 13067, 1566, 1491, 342, 253, 638, 371, 1624, 12425, 273, 7387, 343, 285, 258, 620, 400, 1321, 374, 840, 3629, 253, 1180, 273, 6667, 342, 27620, 13301, 588, 1056, 253, 4836, 625, 2834, 50275, 32897, 19148, 1880, 253, 465, 6667, 275, 5150, 374, 403, 432, 253, 3733, 873, 277, 323, 417, 50276, 14605, 10183, 50274, 74, 1663, 751, 970, 1566, 1491, 323, 18005, 4836, 10183, 253, 760, 4468, 310, 326, 253, 4836, 10183, 7024, 327, 253, 2990, 285, 253, 3733, 5933, 23000, 891, 1804, 281, 7484, 4836, 10183, 7147, 253, 1180, 273, 3733, 6667, 588, 359, 923, 326, 352, 5340, 666, 846, 2217, 1180, 273, 3733, 6667, 403, 1677, 849, 651, 4836, 10183, 21319, 275, 253, 1355, 941, 9459, 50276, 13517, 14259, 50276, 5371, 513, 8083, 285, 15171, 7871, 1599, 323, 15302, 275, 5150, 577, 604, 597, 1599, 253, 1072, 347, 275, 5150, 495, 368, 476, 816, 10280, 253, 1072, 14951, 50276, 32897, 5513, 849, 253, 987, 4608, 2189, 273, 5150, 577, 13840, 715, 253, 7792, 273, 5150, 495, 50276, 9453, 4361, 326, 331, 2068, 328, 571, 270, 310, 26640, 581, 1537, 16084, 18429, 310, 13123, 534, 310, 417, 2032, 4496, 19148, 326, 1097, 5593, 403, 440, 301, 30869, 50276, 1542, 1650, 604, 331, 2068, 328, 571, 270, 50276, 18, 285, 331, 2068, 328, 487, 247, 50276, 18, 840, 581, 476, 2028, 326, 247, 310, 247, 8578, 273, 270, 4496, 5513, 2139, 310, 436, 6197, 2032, 50276, 783, 14308, 273, 5028, 14259, 273, 7424, 577, 285, 608, 1646, 247, 1652, 2372, 10341, 2139, 943, 597, 320, 2931, 751, 326, 849, 513, 841, 5028, 14259, 5593, 7277, 281, 643, 5593, 824, 347, 5028, 14259, 10302, 407, 4836, 19, 4642, 495, 50276, 1542, 6927, 7914, 273, 3036, 608, 368, 476, 897, 253, 1072, 3295, 3711, 275, 512, 749, 42045, 50276, 7645, 5350, 50276, 284, 891, 2096, 1566, 5350, 2931, 275, 5150, 721, 7024, 327, 253, 4836, 2139, 943, 2649, 581, 1379, 25937, 360, 689, 8892, 1512, 50276, 249, 253, 4679, 273, 3036, 721, 513, 512, 15302, 452, 253, 1072, 1979, 
50276, 783, 4951, 3104, 275, 3036, 721, 68, 13414, 16851, 253, 9191, 973, 281, 452, 625, 2792, 327, 253, 11593, 7844, 368, 476, 1908, 501, 3024, 1093, 1671, 1235, 76, 6928, 285, 6889, 465, 50276, 1752, 318, 50276, 5430, 513, 359, 871, 326, 10671, 2931, 275, 5150, 818, 7777, 28174, 849, 1774, 247, 4445, 310, 285, 326, 359, 943, 4517, 253, 11815, 273, 4679, 3559, 275, 3036, 818, 50276, 37585, 7211, 50276, 936, 1056, 253, 2929, 625, 1881, 41010, 4496, 3748, 275, 253, 2022, 2505, 534, 5426, 273, 1566, 1491, 310, 908, 275, 253, 4679, 253, 1072, 10384, 323, 1566, 7162, 50276, 249, 16186, 818, 31934, 2009, 68, 943, 320, 2931, 38565, 50276, 250, 3065, 50275, 18, 1269, 22728, 1182, 12109, 1269, 272, 75, 757, 632, 372, 75, 272, 2443, 285, 480, 74, 259, 86, 10499, 1491, 3700, 275, 11454, 6928, 549, 32693, 638, 3845, 549, 32693, 1518, 2270, 3121, 1348, 9169, 374, 458, 251, 472, 787, 1321, 285, 340, 1136, 258, 620, 400, 1321, 253, 5740, 2978, 273, 3676, 4715, 3210, 5723, 2824, 4765, 495, 247, 348, 4002, 355, 405, 33998, 1162, 355, 4836, 19, 4642, 4836, 21496, 323, 5148, 613, 920, 10061, 273, 253, 26332, 1796, 5213, 8059, 327, 4382, 8113, 6247, 5474, 33032, 436, 2929, 25339, 253, 897, 273, 1566, 1491, 281, 2939, 3607, 273, 3676, 4715, 3210, 285, 3237, 275, 1798, 352, 18303, 849, 1566, 1491, 778, 320, 908, 281, 9232, 253, 10183, 273, 247, 4836, 253, 4248, 273, 14259, 875, 10625, 253, 5350, 273, 247, 1566, 50275, 783, 2934, 273, 970, 247, 10671, 2905, 281, 12738, 293, 1746, 281, 2939, 3607, 273, 247, 4715, 1566, 310, 4722, 6296, 2677, 5411, 253, 3607, 6777, 407, 253, 4477, 4836, 10454, 5028, 14259, 651, 320, 5604, 4217, 2299, 891, 717, 42326, 1880, 253, 6803, 970, 1566, 1491, 651, 320, 9630, 50276, 936, 619, 4685, 1566, 1491, 44995, 253, 2408, 273, 1491, 247, 10166, 1566, 31934, 4428, 670, 10895, 277, 347, 247, 3064, 275, 12738, 293, 1746, 273, 253, 941, 285, 12738, 293, 1205, 44796, 273, 1566, 2203, 347, 5544, 275, 2593, 374, 21038, 352, 3133, 281, 479, 326, 436, 10671, 7024, 1097, 327, 253, 941, 285, 253, 1566, 352, 310, 3103, 247, 1798, 2557, 323, 247, 2173, 1566, 390, 387, 954, 247, 2021, 273, 3210, 285, 247, 10895, 7119, 595, 3738, 5476, 494, 352, 651, 320, 1463, 281, 4853, 512, 253, 14217, 275, 253, 7424, 50276, 249, 2593, 495, 352, 310, 417, 2590, 281, 479, 849, 436, 2557, 778, 320, 908, 281, 7472, 323, 4227, 253, 10183, 273, 247, 4836, 849, 310, 31934, 6777, 310, 627, 271, 6944, 1750, 326, 436, 2557, 651, 320, 3907, 432, 253, 4327, 273, 1566, 752, 310, 253, 7340, 273, 1491, 2557, 281, 253, 4327, 273, 253, 1566, 891, 717, 671, 44229, 264, 407, 253, 1543, 273, 253, 9938, 352, 3133, 326, 18752, 3530, 390, 7362, 17461, 13301, 2789, 253, 4836, 6927, 436, 2167, 310, 3240, 4828, 565, 48714, 347, 352, 651, 1804, 326, 281, 25636, 253, 1895, 359, 812, 12832, 5926, 3530, 390, 660, 45110, 13301, 891, 5476, 533, 891, 778, 320, 3430, 326, 436, 6454, 310, 1955, 281, 253, 958, 326, 10183, 310, 417, 6760, 275, 2426, 273, 26647, 50276, 3549, 6241, 275, 2593, 577, 352, 3133, 281, 479, 326, 247, 5955, 327, 6429, 285, 45505, 310, 14999, 352, 310, 4767, 326, 247, 2867, 273, 46373, 14259, 310, 697, 14275, 432, 1798, 14237, 2307, 3485, 436, 3133, 281, 479, 417, 281, 2186, 323, 1566, 1491, 835, 359, 452, 281, 10725, 327, 6429, 285, 45505, 50276, 249, 2593, 608, 849, 310, 247, 1677, 4836, 246, 2931, 436, 3782, 1774, 347, 352, 13067, 253, 873, 689, 534, 253, 11903, 1320, 310, 10302, 285, 342, 1675, 281, 534, 5350, 310, 2931, 619, 30328, 310, 326, 12672, 436, 10671, 327, 247, 1355, 873, 273, 8892, 651, 1091, 387, 1682, 247, 
2406, 3033, 50276, 74, 452, 690, 24626, 671, 275, 2593, 818, 327, 5150, 854, 352, 651, 1646, 281, 326, 253, 1566, 1491, 273, 253, 5974, 943, 320, 4103, 281, 247, 1027, 10895, 581, 18839, 342, 253, 3453, 20552, 4561, 407, 253, 9732, 2139, 310, 352, 417, 187, 187, 4118, 18435, 27, 2520, 789, 3559, 247, 3862, 873, 273, 4722, 4893, 273, 1566, 1491, 2584, 4685, 4836, 10183, 5028, 14259, 285, 625, 2299, 30628, 497, 7514, 1475, 253, 13091, 285, 8132, 263, 273, 253, 11815, 1469, 715, 625, 6864, 275, 247, 8578, 273, 253, 3672, 3559, 651, 17084, 253, 2929, 347, 651, 2007, 11985, 285, 4679, 1475, 253, 7364, 273, 1566, 1491, 342, 17730, 281, 2173, 3210, 285, 10895, 9552, 347, 368, 452, 13207, 281, 2319, 275, 2593, 854, 23000, 30628, 1119, 253, 9300, 2929, 342, 10291, 281, 38301, 44519, 42017, 10454, 4722, 533, 30628, 3078, 247, 625, 7473, 1971, 285, 1783, 273, 253, 2954, 209 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

This paper presents a pretty cool idea for enabling adaptive kernels for CNNs, which allows a dramatic reduction in the size of models with moderate to large performance drops; in at least one case the training time is also significantly reduced (2x). The best part about this paper is that the models are much smaller, but the paper does not offer any explanation of the value of this. For example, even a 1% drop in accuracy can be unacceptable, yet in some applications, like cell phones and IoT devices, model size is critical; the authors should add some wording to explain this value. The adaptive kernels the authors talk about are really a new class of nonlinear kernels. It would be very interesting to see a discussion of the class of functions these nonlinear kernels represent; this kind of discussion would give the reader motivation for the choice of function, ideas for how to improve within this class of functions, and insight into why it works. The method presented is interesting, but it is not clear that it is presented with enough detail for its results to be replicated; it would be nice if the authors pointed to a git repository with their code and experiments. More importantly, the results presented are quite meager. If this is a method for image recognition, it would be better to present results for a more substantial image recognition problem than MNIST and CIFAR-10, and the analysis of the dynamic range of the algorithm is missing. How do performance and model size trade off? How were the number of layers and kernels chosen? Was the 5x10x20x10 topology used for MNIST the only topology tried? That would be very surprising. What is the performance on all of the other topologies tried for the proposed algorithm? Was cross-validation used to select the topology, and if so, what was the methodology? Additionally, some readers may find this paper a little difficult to read due to (1) lack of clarity in the writing (e.g., the first three paragraphs in section 3), (2) omitted details (e.g., how much overlap exists between kernels? figs. 1, 2, and 4 suggest there is no overlap; this should be made clear), and (3) poor grammar and nonstandard terminology (e.g., the authors' use of the word "energy" and the phrase "degradation problem"). All of these issues should be addressed in a future version of the paper. Not sure why eqns. 2 and 9 need any parentheses; they should be removed.

The paper develops a new convolution operation. I think it is misleading to call it a convolution, as (a) it is not a convolution mathematically, and (b) fast convolution techniques (Fourier, Winograd) cannot be applied, so claims to greater efficiency may be misleading. p. 2-3, section 3.1: I found the equations impossible to read. What are the subscripts over in (2)? Is n1 x n1 the kernel size? Are the sums over 0, 1, ..., n? Is the output of the first convolution a single HxW feature plane or an HxWxn1xn1 tensor? Equation 4: what is d_kl, a pixelwise target label? Where does it come from? Experimental section: like depthwise convolutions, you seem to achieve reasonable accuracy at fairly low computational cost. It would therefore be much more interesting to compare your networks with ShuffleNet-style networks designed for computational efficiency, rather than with networks designed mainly to push the benchmark numbers down whatever the cost. It would be helpful to have the computational cost of the network in FLOPs and running time compared to a regular convnet using Winograd/Fourier convolutions.

The paper introduces adaptive kernels, which adapt their weights as a function of image content, into the CNN framework. The benefit of adaptive kernels is the reduction of memory usage at training and at inference time, as well as training speedups of up to 2x. The kernels are evaluated on two datasets, MNIST and CIFAR-10. I like the idea of building models that are memory efficient at training and at evaluation time; however, the evaluation of the proposed adaptive kernels is rather limited. In order to improve the paper, the authors could take the following points into consideration:
1. Why is there still a need to combine adaptive convolutions with regular convolutions? What would the model performance be for a model with only adaptive kernels?
2. I might have missed it, but I couldn't find any motivation for why tanh is used as the nonlinearity. Would the method work with ReLU?
3. Traditional convolutional kernels together with max pooling operations ensure some degree of translation invariance. How big is the generalization gap for the tested models when the adaptive kernel is used?
4. How sensitive are the results to the number of adaptive kernels in the layers?
5. Adaptive kernels have only been tested in the first convolutional layer. Would the adaptive kernels also work well in different layers?
6. On CIFAR-10 the results seem to be worse than other methods; however, it is important to note that the adaptive-kernels CNN has far fewer parameters. It would be interesting to see how the performance of adaptive-kernel-based CNNs scales with the number of parameters.
7. The evaluation on two datasets seems rather limited; additional comparisons should be included.
8. The authors acknowledge the similarities and some differences with Brabandere et al. (2016). It might be beneficial to include a comparison to this approach in the experimental section; moreover, given the similarities, it might be good to discuss the differences between the approaches in the introduction section.
9. The ideas presented in the paper seem related to the general concept of hypernetworks, where one network learns (or helps to learn) the parameters of the other network. It would be nice to position the ideas from the paper with respect to this line of research too.
10. Another related paper seems to be Spatial Transformer Networks (Jaderberg et al.).
I like the drawings; however, the font on the drawings is too small, making them hard to read. Some typos: (1) "the difficult to train the network"; (2) table 2: "dynamic" / "adaptive". Overall, the paper presents interesting ideas with some degree of originality; I'd encourage the authors to extend the intro, position the ideas with respect to existing works, and extend the evaluation.

### Summary:
The paper presents a modification of the convolution layer where the convolution weights are generated by another convolution operation. While this is an interesting idea, all reviewers felt that the evaluation and results are not particularly convincing and the paper is not ready for acceptance.
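The reviews above disagree on what the proposed "adaptive kernel" operation actually computes. As a point of reference only — the reviewed paper is not reproduced here, so the names `adaptive_conv2d` and `W_gen` are my own and the exact formulation may differ — a minimal Python/NumPy sketch of a content-dependent kernel with a tanh nonlinearity could look like this:

```python
import numpy as np

def adaptive_conv2d(image, W_gen, k=3):
    """Content-dependent 'convolution': at every location a k x k kernel is
    generated from the local patch itself (via the learned tensor W_gen and a
    tanh nonlinearity) and then applied to that same patch."""
    H, W = image.shape
    pad = k // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.zeros_like(image)
    for i in range(H):
        for j in range(W):
            patch = padded[i:i + k, j:j + k]              # local image content
            kernel = np.tanh(np.tensordot(W_gen, patch))  # weights depend on the patch
            out[i, j] = np.sum(kernel * patch)            # apply the adaptive kernel
    return out

# Toy usage: an 8x8 image and a randomly initialised kernel-generating tensor.
rng = np.random.default_rng(0)
image = rng.standard_normal((8, 8))
W_gen = 0.1 * rng.standard_normal((3, 3, 3, 3))   # maps a 3x3 patch to a 3x3 kernel
print(adaptive_conv2d(image, W_gen).shape)        # (8, 8)
```

Because the kernel is regenerated at every location from the patch itself, the operation is neither shift-invariant nor linear in the input, which is the point behind the second reviewer's objection to calling it a convolution.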
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

The paper's idea is pretty much summarized in the title: given a greyscale template image, they find the closest colored reference image in some custom metric and match the greyscale values to the colors using a weighted graph matching. The technical contribution of the paper is formulating the problem as an instance of graph matching and solving it using a polynomial-time approximation introduced in [22]. The overall idea of the paper is quite clean and straightforward: in order to color a template, it makes sense to find a geometrically similar image and use its colors, matching geometrically similar elements. However, this is exactly where the paper falls a little short on its promise: instead of some notion of geometric similarity, two out of the three features the paper uses to compare images are purely pixel statistics (composition and sum of gradient). I guess I would have expected some more geometric features, like shape context, or at least some histograms of gradient. Moreover, the paper doesn't really demonstrate how well the reference image search works. I would have expected some validation of just that stage, perhaps a few closest images and a few far images in the collection at the very least; otherwise it's hard to judge whether it works at all (fig. 4 doesn't help here — the template doesn't look like the reference image). Finally, I'm a little surprised to see the discussion of experiment 2: do I understand correctly that the authors infer their method is superior to percept based on 10 out of 10 students' opinions, i.e., one person on average? I do realize it is not a completely formal user study, but I would not call this statistically significant. So while the paper is written well and clearly, while I appreciate the novelty of formulating the problem as a graph matching and using an approximation algorithm, and even while the final colorizations look pretty, the issues above make me a little less optimistic about the paper. If the other reviewers think those issues are minor, I won't argue, though.

This paper proposes an interesting approach to colorize grayscale graphic arts by using the color scheme of reference images. Given input templates, it first searches for similar reference images in a colored image dataset; then colors can be transferred from the reference images to the input templates. The colorization pipeline is validated by two different user studies. The results are impressive in the paper and the supplementary materials, but the exposition of the paper should be improved; there are many typos and grammatical errors in the paper which hinder understanding. I incline to weakly accept the paper if the authors can improve the exposition and address my concerns in the following.
Major issues:
- The authors claim that they use an analytic approach rather than a combinatorial or an iterative one, but they use the Hungarian algorithm to solve the matching problem, which is essentially an iterative method.
- The method requires the template and the reference images to have the same number of color groups. It is unclear what the color-group distributions in the CRID and VG datasets are, respectively. What are the lower bound and the upper bound on the number of color groups in the implementation based on those datasets?
- How is a reference image selected from the kNN search over the dataset? Are all k candidates feasible for final results? Are all results in figure 1 generated using k reference images? What is the value of k in the implementation? It is not mentioned in the paper.
- Since the idea of graphic arts colorization is inspired by [5], it would be better to show the results of [5], the network of which can be trained on the CRID and VG datasets in a way similar to that presented in their paper.
Minor issues:
- Prepositions in the title should not be capitalized.
- In 4.1: "a reference images are" → "a reference image is" or "reference images are".
- In eq. 4: "pargmax" → "argmax_p"; "eq. eqmaximization" → "eq. 4". In eq. 6: "pargmax" → "argmax_p"; "eq. eqminimization" → "eq. 6".
- In 4.2: the subject is missing after "composition matching mcmp".
- In figure 7: "result demonstrate" → "results demonstrate".

Paper summary: the paper presents a new method for the automatic colorization of pattern-based images using other colored graphics as a reference. The authors show in a study that their colorization method is superior to others. References: the references are good. Implementation: the algorithm is relatively simple and is explained well. Writing: the writing is good and easy to follow; furthermore, the paper is short and still manages to discuss the method in full detail, which is very welcome. Equations 4 and 6 should specify, as a subscript, over which variable the argmax/argmin is performed. Minor writing issues: p1, one keyword is "graphicarts"; p1, "it is considered a valuable ingredient of our artistic abilities so and so that" — unusual use of "so and so"; p2, "if the template and reference image has" should be "have"; p3, "because the content in natural images have" — singular/plural mismatch; p3, "collected from colourloversan online community" — space missing between "colourlovers" and the opening bracket; p4, "each element of the vector represents size of corresponding color group" — article missing; p4, "the objective is to propagate colors of reference image to the input template" — article missing; p5, "we created a graphical user interfacegui" — space missing between "interface" and the opening bracket. Novelty: I am not versed enough in the field of image processing and colorization to judge the novelty of this paper; given the discussion of previous works presented in this article, it seems sufficiently novel to warrant publication. General: this article addresses an interesting problem in image processing with an easy and compelling solution. The authors apply their method to a variety of images to be colored and perform a user study to quantify their method. This paper is ready for publication after the minor issues in this review have been addressed.

### Summary:
All the reviewers agree that the results are impressive and the problem is interesting; the exposition and validation of the paper, however, need some work (R1, R3). Since overall the reviewers are positive about the paper, I recommend accepting it with the following provisions: clarify or rephrase the analysis of the second user study (R1); add missing details on kNN (R3) and, if possible, a figure demonstrating how well the reference search works (R1); fix typos (R2, R3).
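Both reviews of this paper hinge on how color groups of the greyscale template are matched to those of the colored reference. As an illustration only — the paper's actual cost combines composition, size, and adjacency terms that are not reproduced here, and `match_color_groups` with its toy descriptors is purely hypothetical — the core assignment step that the second reviewer calls "the Hungarian algorithm approach" can be sketched as:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_color_groups(template_feats, reference_feats, reference_colors):
    """Assign every grayscale template group to one reference color group by
    minimising a pairwise cost; here the cost is just the Euclidean distance
    between simple group descriptors (e.g. relative area, mean intensity)."""
    cost = np.linalg.norm(
        template_feats[:, None, :] - reference_feats[None, :, :], axis=-1
    )
    rows, cols = linear_sum_assignment(cost)   # Hungarian / Kuhn-Munkres step
    return {int(r): reference_colors[int(c)] for r, c in zip(rows, cols)}

# Toy usage with 4 color groups per image, described by (relative area, mean gray).
template_feats  = np.array([[0.50, 0.9], [0.25, 0.6], [0.15, 0.3], [0.10, 0.1]])
reference_feats = np.array([[0.45, 0.8], [0.30, 0.5], [0.15, 0.2], [0.10, 0.05]])
reference_colors = ["#f2e8da", "#7fb069", "#e6aa68", "#1d3557"]
print(match_color_groups(template_feats, reference_feats, reference_colors))
```

Framed as a square assignment problem, every template group receives exactly one reference color, which is also why both images must expose the same number of color groups — the constraint the second reviewer questions.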
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review:
this paper presents a unified framework for doing point cloud upsampling denoising and completion jointly the proposed framework contains two modules a pointtovoxel autoencoder and a point relocalization transformer it outperforms singletask baselines on multiple benchmarks by a great margin note i will use the word multitask to refer to the problem setting in this work in contrast to the singletask baselines i understand that multitask is slightly abused here as one may argue that the studied problem is a unified single task rather than three independent ones

strengths
1 the highlevel idea makes lots of sense to me using voxelization to denoise and complete irregular point clouds is reasonable and smart besides we can sample as many points as we want from a voxel grid to get a dense surface im not 100 sure whether this idea has been studied before as im not an expert for these two tasks if possible the authors could further elaborate on this contribution
2 the empirical results are strong the proposed framework greatly outperforms singletask baselines
3 writing is mostly clear and proper ablation has been conducted

weakness
1 the presented results are great but the baselines are not very considerate this paper is arguing that the three tasks should be unified and hence multitask jointlearning baselines should be considered the simplest setting is to ensemble all three baselines together in an optimized order alternatives contain training each baseline method on all three tasks jointly
2 following the above point the proposed method should also be evaluated for each single task to demonstrate advantages for example i would be only curious about how good the proposed framework is for point completion if my point cloud isnt noisy does this method improve over a specified pointcompletion network
3 im not fully convinced by the story of positional encoding in the original form the distance information is already expressed as the frequency of the signal the magnitude is just redundant information why is it important and what kind of effect will it cause precisely
4 lots of details are unclear i list several that confuse me most how many voxels are processed in the transformer on average since the complexity of the attention function is o(n^2) does the proposed framework suffer from big memory consumption for the tested datasets especially the synthetic ones how are the input noise sparsity and incompleteness created scannet has lots of incomplete objects eg chair with no legs but i dont find any qualitative results showing that the network can complete them which is unsatisfactory also regular synthetic noise eg gaussian noise is much simpler than realworld noise how does the proposed framework deal with realworld noise for example mvs reconstructed pointclouds a similar thing also holds for sparsity its unclear how this framework works for realworld sparse point clouds eg the remote areas in lidar data if the authors think these are out of the scope then please clearly introduce the problem setting and limitations how is the dimension of pout decided i thought an arbitrary number of points can be generated if this is true how are the numbers in tab1 decided i found these values of pout confusing and they are not consistent with prior work

misc
1 the difference between sparse point upsampling and incomplete pointcompletion is not very clear a very specific definition should be made in the very early stage of the writing so that there is no ambiguity between the challenges that these two problems pose besides the experimental results should clearly demonstrate how the proposed framework addresses each subproblem above (eg denoising) individually unfortunately these results are either missing or not very clear right now
2 i find it misleading to term this task pointcloud reconstruction a maybe improper analogy is that we dont call joint image denoising and superresolution "image generation" if the authors really want to use the term reconstruction maybe call it conditioned point cloud reconstruction or something else that is more accurate like point cloud refinement im just making examples and throwing out random names
3 its unclear to me whats the motivation of clustering voxel hashing and how important this step is why do we need a clustering before the transformer and where is the cluster information used in the later framework if i understand correctly this is used because the voxeltopoint step only operates on center voxels and hence we want a locally augmented feature representation

minor issues
1 fig2 input figure is barely visible a better colormap should be used
2 3d sparse hourglass network is not a super popular network its also not implemented in popular sparse conv net frameworks eg minkowski engine why this framework is adopted maybe the authors can share some related work to justify the advantage its not a common choice

suggestions
1 in tab1 maybe a light color can be used to highlight the 2nd best among the baseline methods highlighting the 2nd best among all including ours cannot fully demonstrate the advantages

summary
my current rating is mainly based on two reasons 1 the experimental results are strong and the idea of utilizing voxelization for denoising/refining/densifying point clouds makes lots of sense to me in addition the proposed transformer works well for further improving the results and recovering points from voxels 2 even though the numbers are great i do think the experiments can be improved multitask baselines should be devised and compared its not enough to only demonstrate the proposed method outperforms singletask baselines moreover finegrained experiments should also be conducted to help us understand the performance on each single task

i will also use this section to justify my ratings below this paper is mainly proposing a new problem setting and develops a new framework correspondingly technical wise this new framework doesnt carry significant new components it does a great job choosing the proper submodules combining and slightly improving them for the final tasks however most of these components existed in the literature empirical wise the results in tab1 are exciting and the proposed framework greatly improves over singletask baselines more strictly speaking single tasks and twotask x + scorepd baselines the qualitative results in fig5 are also encouraging and clearly show the advantages

i have worked in the field of pointcloud recognition and am pretty familiar with the recognition frameworks however im not an expert of pointcloud denoising/completion/densifying and not very familiar with the sota methods in these fields

the paper proposes an endtoend unified approach to solve the subtasks in point cloud completion the main selling point of the paper is that it combines two existing methods for point cloud densification denoising stage 1 and point cloud completion stage 2 the paper does extensive experiments on 3 datasets and compares to 3 different baselines they achieve sota performance on these datasets the use of sparse convolution to process point clouds is quite interesting as an autoencoder termed hourglass aligns with the task at hand

strengths
1 the paper brings a new perspective solving two old problems in computer vision and this is useful i like the idea it is clean and simple
2 the twostage novel architecture is simple but apart from the additional positional embedding i dont think there is any other addition here but the stage1 and stage2 architectures have been used in the past to solve problems my understanding is that the network mainly benefits from stage1 as a lot of qualitative results concentrate on completion
3 the numbers are better and change with voxel resolution but are significantly better than the baseline approaches
4 qualitative examples are good and show the performance improvement but are limited and i would like to see more of them in the supplemental which is missing

weakness
1 this paper needs quite a few presentation changes to make the idea clear and easy flowing the initial part of the method section of the paper up until 3.1 is clear and well explained while 3.2 needs a better explanation i still do not understand the exact intuition behind the amplitudebased positional embedding and the logic is not very clear
2 the baselines are not explained well at all im not sure if the comparison to the baselines is fair because the exact characteristics of the snowflakepc + scorepd are not explained and likewise for dispu these sections need rewriting
3 there is no evaluation independently for stage1 all evaluations in the paper use stage1 and stage2 i think to justify the need for stage2 and see its impact it is important to set up some evaluation criteria for stage1 only i understand that maybe doing the full evaluation on stage1 is not possible
4 reiterating a point i mentioned earlier more qualitative examples are needed
5 the paper claims that unlike past work when they upsample they dont have this scalingup factor of r but i think in this case this r just manifests as the voxel density in stage 1
6 im also not clear what happens when multiple voxel centers move to the same location in 3d points

justification
i think the paper makes interesting contributions but lacks extensive qualitative results and the snowflakepc + scorepd and similar things make me wonder if things are fair the performance gap between the method and baselines is quite huge and im not able to completely understand the exact difference in impact between stage 1 and stage 2 depending upon the rebuttal by the authors i will change my rating upwards if given reasonable explanations

this paper proposes an elegant twostage pipeline to jointly solve the three tasks of point cloud densification denoising and reconstruction at one time voxel densification and denoising are performed using a 3d sparse stacked hourglass network and then voxels are reconstructed into dense point clouds using transformers the experimental results on shapenet exceed most existing methods and have strong generalization ability

this paper mainly has two strengths first a twostage pipeline is innovatively proposed which can enable the network to solve multiple tasks such as point cloud denoising completion and reconstruction at one time and the reconstruction accuracy and visualization results of this method can surpass the latest methods in various sub fields second the innovative use of amplified positional encoding and transformers to compute the relationship between a center voxel and its neighbor voxels has certain enlightening significance for the followup work

but at the same time this paper also has some shortcomings from the experimental results the shapenetpart dataset is comparable to sota results and the generalization ability on the scannet and iclnuim datasets is significantly better than other methods but the paper only compares the experimental results it would be better if the paper could explain why the proposed new method can achieve such strong generalization ability and why the generalization ability of the existing methods is poor

this paper creatively integrates different tasks such as point cloud denoising completion and reconstruction into a twostage task and the generalization ability of the experimental results on the scannet and iclnuim datasets is much better than the latest existing methods this paper creatively uses amplified positional encoding to compute the relationship between a center voxel and its neighbor voxels which has certain enlightening significance for the followup work

the paper proposes a new method to jointly address the tasks of point cloud denoising completion and upsampling they call this joint problem point cloud reconstruction the method consists of two main modules a voxel generation module to increase voxel density and remove outliers implemented as an hourglass sparse convolutional neural network and a point relocalization module to convert the discrete voxels back to point clouds implemented using a transformer network the results show good performance in the chosen metrics and datasets the problem of point cloud reconstruction combines several existing important challenges in point cloud processing the paper is well written and easy to follow however some details remain unclear and the evaluation choices inconclusive

major issues
since many scans are combined into a single scan there is another challenge that is entirely overlooked misalignment if the idea is to combine all challenges into one this should be another challenge to address
missing related work there is a very large body of recent works on neural shape implicit representation which focuses on the surface reconstruction task while i understand how this is a bit different from the proposed point cloud reconstruction one could simply sample points on the zero level set and achieve point cloud reconstruction among these works are deepsdf occupancy networks sal sald dpdist igr convolutional occupancy networks phase siren digs etc these are missing particularly since in section 3.2 it is claimed that the local 3d shape is estimated using neighbour voxels
approach in the described approach it was not clear to me how the final number of points is specified also not clear how color information is determined for the newly generated points
evaluation baselines i find the choice of the baselines questionable first there was no unified approach used denoising completion sampling second it would make more sense to me to compare the proposed method on the others task since it is expected that a denoising method will not do well for upsampling third in the combined baselines snowflakepc+scorepd and dispu+scorepd the order of application is not clear it makes sense to first denoise and then upsample/complete but the text suggests the opposite which should also be reported finally as mentioned in the related work comment the paper lacks comparison to implicit representations surface reconstruction
evaluation datasets the chosen datasets are commonly used and an understandable choice for this task however they are strongly biased towards planar geometries for example most shapes in shapenet and most scenes in scannet have large planes i expected to see results on the surface reconstruction benchmark berger et al which is small but has a mix of complex geometries
evaluation time and memory requirements are missing from the implementation details it seems to be resourceheavy how does that compare to the other baselines
lack of insight the paper is missing an important subsection that provides some insight into what the different modules have learned and how that is beneficial to the task this raises some question about novelty how is this approach better than the sum of its technical parts
in its current form the paper is not selfcontained and the method is not reproducible due to a lack of technical details

minor issues
there is an issue with the term continuous 3d points which repeats itself in the paper section 3.2 for example point clouds are not continuous they are sampled on a continuous surface
evaluation the evaluation metrics were not properly specified when performing standardized experiments it may sometimes be acceptable to commit this however in this case the paper proposes a new task therefore the evaluation measures should be specified
vmid is missing from fig 2 and fig 3 one can only assume where it is in the architecture it should be better specified
section 2 point cloud upsampling "input points input points" typo

in summary i find the paper well written and the problem interesting the approach is well presented and technically sound for the most part however due to the lack of a significant body of works in the related work section as well as the missing evaluations i cannot recommend this paper for acceptance at this stage

### Summary:
the paper proposes a unified framework for point cloud upsampling denoising and completion through a twostage approach it receives four reviews with three leaning to accept and one leaning to reject most of the reviewers like the proposed twostage approach for its simplicity and demonstrated strong performance the reviewer recommending marginally below the acceptance threshold expresses concerns about missing comparison to neural shape implicit representation and a lack of insights on what is learned by individual layers in the network while the metareviewer agrees that having both would make the paper stronger the metareviewer feels the paper has enough merit and would like to recommend its acceptance
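None of the reviews above include code, but they all reason about the same core operation: quantizing an irregular point cloud onto a voxel grid so that denoising and completion can be done on the grid, then turning voxels back into points. The sketch below is only a rough numpy illustration of that idea under my own assumptions; the function name, voxel size and the min_points outlier heuristic are illustrative and not taken from the paper or the reviews.

```python
import numpy as np

def voxelize(points, voxel_size=0.05, min_points=2):
    """Quantize an (N, 3) point cloud onto a regular voxel grid.

    Voxels supported by fewer than `min_points` points are dropped, a crude
    stand-in for outlier removal; the surviving voxel centers form a cleaned,
    evenly spaced point set that a downstream network could refine.
    """
    origin = points.min(axis=0)
    idx = np.floor((points - origin) / voxel_size).astype(np.int64)
    uniq, counts = np.unique(idx, axis=0, return_counts=True)
    kept = uniq[counts >= min_points]
    centers = origin + (kept + 0.5) * voxel_size
    return kept, centers

# toy usage: a thin noisy slab plus one far-away outlier point
pts = np.random.rand(1000, 3) * [1.0, 1.0, 0.02]
pts = np.vstack([pts, [[5.0, 5.0, 5.0]]])
voxels, cleaned = voxelize(pts, voxel_size=0.1)
print(voxels.shape, cleaned.shape)  # the isolated outlier voxel is dropped
```

In the pipeline the reviews describe, the occupancy heuristic is replaced by a sparse 3D hourglass network (stage 1) and the naive voxel-center readout by a transformer-based point relocalization module (stage 2), which is exactly the split the reviewers ask to see evaluated per subtask.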
[input_ids: token-id encoding of the review/summary example above; long numeric array omitted for readability]
[attention_mask: a list of 1s matching the input_ids length; omitted for readability]
[labels: token-id list identical to the input_ids above; omitted for readability]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review:
this work proposes to change the labels for the mixed examples in mixup my major concerns are as follows
1 the motivation of adopting soft labels for mixup is not clear label smoothing is helpful for generic training but why can it benefit mixup
2 the proposed method is more like a combination of mixup and label smoothing the improvement may come from label smoothing as a generic trick rather than mixup itself
3 the performance of the proposed method is very close to mixup where the improvement is not significant additional experiments on imagenet can make the results more convincing

main claim
the authors propose to use soft labels on naive mixup as an alternative to sophisticated mixup strategies in experiments on 5 small datasets the proposed method can achieve better accuracy than baselines

strong points
the authors propose to use soft labels to overcome the mislearning features in mixup images the idea is simple and straightforward experiment results show the method works well on small datasets

weak points
the contribution of this work is incremental experiment results on imagenet are missing on tinyimagenet its also helpful to show the performance of mixup and cutmix some decisions are made without justification some details are not clearly explained see questions

recommendation reject the proposed method is incremental the method should be evaluated on imagenet

questions
in eq9 a sigmoid function is applied before the softmax function so the unnormalized logits of this distribution are in 1 1 why
why are "target soft labels too far away" an issue for the model why does a minibatch with original inputs prevent lamix from assigning target soft labels too far away if the trick is not applied what will happen to the model will the model take more time to converge or will it not converge
for lamix the added fully connected layer is just a copy of the fullyconnected layer of the original network with a softmax function on the top is the network pretrained are the two matrices sharing weights
as shown in figure 2 the top 2 classes occupy a large portion of the probability so im curious which part actually contributes to the improvements is it 1 the reweighting of yi and yj or 2 the introduction of other labels ie after computing eq 10 keep the values for yi and yj set all other dimensions to zero then renormalize the distribution and use it as the training label what will happen my impression is it may solve the problem of target soft labels too far away

comments
eq 3 is confusing to me i think the authors can follow the convention in yun et al and rewrite the equation as x_lambda^{ij} = phi(x_i, x_j, lambda) * x_i + (1 - phi(x_i, x_j, lambda)) * x_j
"has tow forms" should be "has two forms"

after rebuttal thanks to the authors for providing additional experimental data but without the results of imagenet it is difficult to judge the effectiveness of this method on complicated data so i decided to keep the original score

summary
the previous advanced mixup methods such as cutmix and puzzlemix involve input mixing this paper suggests a new mixup approach called lamix that does not require input mixing the solution is combining the original target label interpolation of two onehot targets and generated target labels from an additional network to use for training the authors argue that lamix achieves superior performance without input mixing

reasons for score
the authors should explain the reason why using global soft labels is theoretically plausible before describing the method although empirical results look nice i am suspicious about the experimental settings for the comparison with other methods

pros
the paper includes diverse results on probing experiments the paper is clearly written

concerns/questions
as far as i understood the neural network parameters for the global soft label wt are not trained because they are only used for training labels then these parameters are just randomly initialized values is that right if so isnt the final effect sensitive to the initialization
sigmoid activation is used in equation 9 different from equation 7 however there is no explanation about the reason for using it my guess is to make artificial labels similar to each other
i dont understand why beta in lamix is 1.0 for the section 3.2 experiments first i think beta = 1.0 means not using the global soft label and it is equivalent to standard mixup second the authors mention that setting beta to 0.5 is a good choice in section 3.1.3 figure 4 moreover they use beta as 0.5 for the experiments of section 3.1.1 table 1 the settings of table 1 and table 3 model architectures and datasets are the same except for the beta value of lamix
could you provide the hyperparameters used for the other mixup methods as baselines are they welltuned
i am curious whether the combination with the global soft label can be done after input mixing if so i think providing these results would be helpful to check whether the regularization effect is orthogonal to input mixing
to compare lamix with label smoothing i think the authors should apply label smoothing to mixup rather than the vanilla setting

minor comment "puzzle mix" should be "puzzlemix"

summary
this paper simply combines mixup and selfdistillation to achieve a more adaptive soft label which effectively regularizes the training in the manuscript the authors argue that existing mixupbased approaches have two main issues they may create misleading training samples or incur computation cost in creating samples motivated by this they propose lamix which can leverage the information of selfdistillation to address those two issues and achieve competitive performance with the sota puzzlemix

comment
1 this paper introduces a combination method between mixup and selfdistillation and simply uses an additional fc layer to obtain an adaptive soft label which is intuitive and clear but lacks novelty can you give a more insightful comment about how the adaptive label can help the mixup soft label
2 in figure 3 i think the provided evidence for the effect of considering the adaptive soft label with the mixup approach is promising
3 for the experimental results in section 3.2 the proposed lamix seems not to achieve significant improvement compared to the sota puzzlemix on cifar10 and 100

### Summary:
this work proposes to improve mixup by using soft labels removing the need for input mixup the reviewers found the paper was clear and found the experiments promising the reviewers raised concerns about the lack of experiments comparing this approach to mixup + label smoothing which were addressed during the rebuttal by the authors however the reviewers did not find the empirical evidence strong enough given that this is mostly an empirical contribution the authors do not necessarily need to train on the full imagenet but it would be beneficial to evaluate on more standard settings on the dataset considered to facilitate comparison to previous work
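For readers who have not seen the two ingredients the reviewers keep contrasting, here is a minimal numpy sketch of vanilla input mixup combined with uniform label smoothing. It is only an illustration of those baseline concepts, with function names and hyperparameters of my own choosing; it is not the lamix method, which generates its soft labels with an extra network instead of mixing inputs.

```python
import numpy as np

def one_hot(y, num_classes):
    out = np.zeros((len(y), num_classes))
    out[np.arange(len(y)), y] = 1.0
    return out

def mixup_with_smoothing(x, y, num_classes, alpha=0.2, eps=0.1, rng=np.random):
    """Standard input mixup plus uniform label smoothing on the mixed target.

    lam ~ Beta(alpha, alpha); the soft target is the lam-weighted mix of the
    two one-hot vectors, then smoothed towards the uniform distribution.
    """
    lam = rng.beta(alpha, alpha)
    perm = rng.permutation(len(x))
    x_mix = lam * x + (1.0 - lam) * x[perm]
    y_mix = lam * one_hot(y, num_classes) + (1.0 - lam) * one_hot(y[perm], num_classes)
    y_mix = (1.0 - eps) * y_mix + eps / num_classes
    return x_mix, y_mix

x = np.random.randn(8, 32)            # toy feature batch
y = np.random.randint(0, 10, size=8)  # toy integer labels
xm, ym = mixup_with_smoothing(x, y, num_classes=10)
print(xm.shape, ym.sum(axis=1))       # each soft target still sums to 1
```

The reviewers' central question is whether gains attributed to the learned soft labels could instead come from the eps-style smoothing term alone, which is why they ask for a mixup plus label smoothing baseline rather than a comparison against vanilla training.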
[input_ids: token-id encoding of the review/summary example above; long numeric array omitted for readability]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 789, 29328, 281, 1818, 253, 13301, 323, 253, 6804, 6667, 275, 5878, 484, 619, 2201, 7350, 403, 347, 3637, 337, 186, 783, 16038, 273, 25987, 2602, 5203, 323, 5878, 484, 310, 417, 2590, 5203, 36971, 310, 9371, 323, 12314, 3733, 533, 2139, 352, 476, 5649, 5878, 484, 374, 186, 783, 4081, 1332, 310, 625, 751, 247, 5019, 273, 5878, 484, 285, 5203, 36971, 253, 7756, 778, 1705, 432, 5203, 36971, 347, 247, 12314, 10480, 2581, 685, 5878, 484, 3139, 495, 186, 783, 3045, 273, 4081, 1332, 310, 1077, 2810, 281, 5878, 484, 835, 253, 7756, 310, 417, 1534, 3081, 4679, 327, 4440, 257, 292, 476, 1056, 253, 1543, 625, 21414, 7152, 339, 2617, 404, 1750, 50276, 783, 4477, 12661, 281, 897, 2602, 13301, 327, 27785, 5878, 484, 347, 271, 5795, 281, 18144, 5878, 484, 5700, 275, 4679, 327, 608, 1355, 15302, 253, 4081, 1332, 476, 5115, 1805, 7200, 685, 1666, 25379, 50276, 9072, 2792, 50276, 783, 4477, 12661, 281, 897, 2602, 13301, 281, 11399, 253, 3731, 28269, 3386, 275, 5878, 484, 3888, 253, 2934, 310, 2969, 285, 15246, 50275, 16217, 2092, 1543, 921, 253, 1332, 2987, 973, 327, 1355, 15302, 50275, 20881, 2792, 50276, 783, 7680, 273, 436, 789, 310, 32809, 50276, 16217, 2092, 1543, 327, 4440, 257, 292, 403, 5816, 327, 10058, 303, 6533, 292, 697, 671, 9371, 281, 921, 253, 3045, 273, 5878, 484, 285, 2624, 24706, 50276, 8826, 7089, 403, 1160, 1293, 22861, 690, 4278, 403, 417, 4518, 5544, 923, 3533, 50274, 250, 27167, 318, 50276, 49844, 50276, 783, 4081, 1332, 310, 32809, 253, 1332, 943, 320, 6760, 327, 4440, 257, 292, 50275, 34974, 50276, 249, 16186, 26, 247, 9788, 78, 1238, 1159, 310, 3732, 1078, 253, 2602, 4090, 1159, 594, 253, 440, 6320, 1025, 2412, 953, 273, 436, 3268, 310, 275, 337, 337, 2139, 50276, 22309, 310, 2303, 2602, 13301, 1512, 2080, 1977, 271, 2523, 323, 253, 1566, 2139, 1057, 247, 1054, 487, 1506, 342, 3236, 14800, 3657, 16519, 895, 432, 34018, 2303, 2602, 13301, 1512, 2080, 1977, 604, 253, 10480, 310, 417, 3732, 752, 588, 5108, 281, 253, 1566, 588, 253, 1566, 1379, 625, 673, 281, 29623, 390, 352, 31451, 29623, 50276, 1542, 16519, 895, 253, 2879, 4751, 4802, 3828, 310, 816, 247, 3491, 273, 253, 4751, 14063, 3828, 273, 253, 3236, 2990, 342, 247, 2602, 4090, 1159, 327, 253, 1755, 310, 253, 2990, 3215, 11273, 403, 253, 767, 12624, 9628, 13461, 50276, 284, 2011, 327, 4677, 374, 253, 1755, 374, 5971, 26263, 247, 1781, 5110, 273, 253, 5912, 594, 516, 14338, 534, 629, 2686, 17904, 281, 253, 11701, 310, 352, 247, 253, 294, 6712, 272, 273, 340, 74, 285, 340, 75, 390, 374, 253, 10199, 273, 643, 13301, 26332, 846, 12672, 16186, 884, 1978, 253, 1318, 323, 340, 74, 285, 340, 75, 873, 512, 643, 10103, 281, 5058, 840, 26749, 907, 253, 3268, 285, 897, 352, 347, 253, 3733, 5203, 752, 588, 5108, 50276, 2577, 13214, 310, 352, 778, 8415, 253, 1895, 273, 50276, 7831, 2602, 13301, 1512, 2080, 1977, 50276, 26122, 50276, 2574, 495, 310, 21643, 281, 479, 891, 1158, 4477, 476, 956, 253, 5008, 275, 340, 328, 1162, 355, 24813, 253, 5150, 347, 1269, 2260, 1944, 545, 895, 74, 1269, 75, 29331, 50276, 2981, 50276, 18, 50276, 545, 895, 74, 1269, 75, 29331, 50276, 89, 75, 50276, 7110, 12413, 4948, 50276, 7110, 767, 4948, 50276, 6438, 30080, 22559, 50276, 35501, 281, 253, 2488, 323, 5277, 3081, 5661, 941, 533, 1293, 253, 1543, 273, 4440, 257, 292, 352, 310, 2834, 281, 5963, 253, 12510, 273, 436, 1332, 327, 9542, 941, 594, 891, 4425, 281, 1978, 253, 3236, 4868, 5474, 339, 793, 360, 3454, 253, 
2045, 7269, 5878, 484, 3082, 824, 347, 2624, 24706, 285, 21843, 5616, 895, 6388, 3280, 12480, 436, 2929, 5936, 247, 747, 5878, 484, 2746, 1925, 16519, 895, 326, 1057, 417, 2430, 3280, 12480, 253, 2900, 310, 16248, 253, 3236, 2303, 5203, 30370, 273, 767, 581, 12022, 8571, 285, 4561, 2303, 13301, 432, 271, 3081, 2990, 281, 897, 352, 323, 3733, 253, 4477, 9059, 326, 16519, 895, 33526, 8936, 3045, 1293, 3280, 12480, 50276, 250, 3743, 323, 4868, 253, 4477, 943, 5513, 253, 1921, 2139, 970, 4156, 2602, 13301, 310, 28055, 21541, 1078, 12930, 253, 1332, 3738, 16774, 1543, 1007, 5322, 891, 717, 20634, 670, 253, 5661, 7533, 323, 253, 5301, 342, 643, 3082, 50276, 856, 84, 50276, 783, 2929, 3797, 11117, 1543, 327, 39578, 4679, 50276, 783, 2929, 310, 4518, 3542, 50276, 585, 1209, 2224, 34974, 50276, 284, 2080, 347, 891, 7192, 11454, 2990, 3602, 323, 253, 4156, 2602, 5203, 22923, 403, 417, 10166, 984, 597, 403, 760, 908, 323, 3733, 13301, 840, 841, 3602, 403, 816, 12421, 31260, 2193, 310, 352, 987, 604, 594, 310, 2649, 253, 2457, 1055, 7996, 281, 253, 31850, 50276, 84, 15379, 1238, 5743, 310, 908, 275, 5150, 898, 1027, 432, 5150, 818, 2299, 627, 310, 642, 8813, 670, 253, 1921, 323, 970, 352, 619, 5476, 310, 281, 1056, 13345, 13301, 2074, 281, 1016, 643, 50276, 74, 13414, 2096, 2139, 9840, 275, 16519, 895, 310, 884, 323, 2593, 4567, 4679, 806, 891, 1158, 9840, 50276, 740, 2097, 417, 970, 253, 4156, 2602, 5203, 285, 352, 310, 6425, 281, 2629, 5878, 484, 1273, 253, 4477, 3748, 326, 4758, 9840, 281, 16987, 310, 247, 1175, 4327, 275, 2593, 31389, 4677, 577, 25761, 597, 897, 9840, 347, 16987, 323, 253, 4679, 273, 2593, 32236, 2829, 337, 253, 7533, 273, 2829, 337, 285, 2829, 495, 1566, 35615, 285, 15302, 403, 253, 1072, 3707, 323, 253, 9840, 1318, 273, 16519, 895, 50276, 16534, 368, 2085, 4373, 22041, 908, 323, 643, 5878, 484, 3082, 347, 247, 8245, 403, 597, 973, 85, 37437, 50276, 74, 717, 14338, 1880, 253, 5019, 342, 253, 4156, 2602, 5203, 476, 320, 2218, 846, 3280, 12480, 604, 840, 891, 1158, 5277, 841, 1543, 651, 320, 9371, 281, 2451, 1880, 253, 37820, 1055, 310, 19627, 281, 3280, 12480, 50276, 936, 7277, 16519, 895, 342, 5203, 36971, 891, 1158, 253, 2488, 943, 4647, 5203, 36971, 281, 5878, 484, 2581, 685, 253, 26724, 4758, 50276, 37585, 4385, 50276, 11113, 12610, 5878, 50276, 11113, 4396, 5616, 895, 7152, 339, 793, 360, 3454, 50276, 2520, 2929, 3365, 24772, 5878, 484, 285, 1881, 8155, 21755, 281, 5115, 625, 17825, 2602, 5203, 534, 8069, 3963, 907, 253, 3733, 275, 253, 7714, 4477, 9059, 326, 253, 13164, 5878, 484, 3169, 7274, 556, 767, 7194, 6031, 778, 2794, 24363, 3733, 3530, 390, 2525, 13782, 2105, 2523, 327, 6153, 3530, 17194, 407, 436, 597, 12661, 16519, 895, 534, 476, 25057, 253, 1491, 273, 1881, 8155, 21755, 281, 8415, 1110, 767, 6031, 285, 5115, 12085, 3045, 342, 256, 5503, 21843, 5616, 895, 484, 50272, 13982, 50276, 18, 436, 2929, 23970, 247, 5019, 1332, 875, 5878, 484, 285, 1881, 8155, 21755, 285, 3365, 897, 271, 3081, 269, 68, 3828, 281, 452, 17825, 2602, 5203, 534, 310, 27350, 285, 2590, 533, 3480, 273, 38135, 476, 368, 1918, 625, 47860, 4385, 670, 849, 17825, 5203, 476, 1361, 5878, 484, 2602, 5203, 50276, 19, 275, 4677, 495, 891, 1158, 253, 2530, 1941, 323, 253, 1055, 273, 7296, 17825, 2602, 5203, 342, 5878, 484, 2746, 310, 12532, 50276, 20, 323, 5661, 1543, 275, 2593, 4567, 253, 4081, 16519, 895, 3133, 417, 5115, 1534, 7756, 2429, 281, 253, 256, 5503, 21843, 5616, 895, 327, 260, 338, 274, 740, 285, 2233, 2490, 187, 4118, 18435, 27, 2520, 789, 29328, 281, 3157, 5878, 484, 407, 970, 2602, 13301, 
11922, 253, 878, 323, 3280, 5878, 484, 253, 30628, 1119, 253, 2929, 369, 2590, 285, 1119, 253, 4679, 12532, 253, 30628, 5439, 7350, 670, 253, 3480, 273, 4679, 10941, 436, 2746, 281, 5878, 86, 446, 1492, 36971, 534, 497, 9713, 1309, 253, 30080, 22559, 407, 253, 4477, 2299, 253, 30628, 858, 417, 1089, 253, 16774, 1941, 2266, 2217, 1677, 326, 436, 310, 6571, 271, 16774, 7680, 253, 4477, 513, 417, 7933, 878, 281, 6194, 327, 253, 2120, 4440, 257, 292, 533, 352, 651, 320, 12912, 281, 7472, 327, 625, 2629, 7533, 327, 253, 10895, 2783, 281, 12454, 5301, 281, 2045, 789 ]
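For context on the numeric columns above: they are typically produced by running each row's input and output strings through a tokenizer. The snippet below is only a minimal sketch of that step; the tokenizer choice ("gpt2"), the maximum length, the concatenation scheme, and the function name tokenize_row are all assumptions made for illustration, since the actual preprocessing behind this dataset is not shown here.

```python
from transformers import AutoTokenizer

# Purely illustrative: "gpt2" is an assumed stand-in, not necessarily the
# tokenizer that produced the IDs shown above.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

def tokenize_row(row, max_length=2048):
    # Encode the prompt/review text together with its summary as one sequence,
    # so the labels column can simply repeat input_ids (as it does in the rows above).
    text = row["Input"] + " " + row["Output"]
    enc = tokenizer(text, truncation=True, max_length=max_length)
    enc["labels"] = list(enc["input_ids"])
    return enc

row = {
    "Input": "Below is given review of a research paper ... ### Review: ...",
    "Output": "this work proposes ...",
}
features = tokenize_row(row)
print(len(features["input_ids"]), features["attention_mask"][:5], features["labels"][:5])
```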
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this work presents elast an approach for longhorizon planning through samplingbased motion planning in learned latent spaces given a high dimensional observation images the approach first learns a latent representation using selfsupervised contrastive learning a latent dynamics model and a transition density function this latent space is then searched via expansive space trees to solve motion planning problems from initial states to goal states finally plans are executed via mpc elast allows efficient searching of spaces and longhorizon planning as shown over a number of environments the paper is generally well written and clear as are the figures demonstrating the approach the area of learning latent representations is useful and important the minimal assumptions and requirements on data for elast makes the method generally applicable based on the planning results and plots of the latent spaces the latent space learned is generally robust the mpc replanner seems to be effective for following the trajectories in terms of number of environments and comparisons the experiments section is thorough though see notes below on complexity of the environments the detailed ablations in the appendix and ablations in the experiments section are well done the two main weaknesses i see are 1 the complexity of the environments both in terms robot dimensionality and environment variation and 2 the novelty compared to prior approaches for latent tree searches 1 each of the environments has relatively low dimensional true important state spaces which is an area where these latent space methods perform well it would be useful to see how it performs in spaces such as an se2 or 3 rigid body in a complex environment furthermore it would be useful to see how it performs with more complex dynamics such as a kinodynamic system which may force higher dimensional learned latent spaces a double integrator for instance finally as a benefit over prior methods is that it does not require a collision checker which allows some degree of generalization to new environments it is imperative the authors demonstrate the approach on more complex obstacle environments which can be encoded via the context im not sure from the results shown that it would perform well in such cases in the current implementation the conditioning on the actual states of parameterization such as the position of the moved block does not seem consistent with the rest of the paper why not condition on the image alone a 2d environment with significantly more variations and objects than 5c may suffice to show that it can work in such a more complex setting though scaling to 3d scenes would be more informative 2 the novelty compared to prior works using treesearch in latent space is potentially limited for example 15 does use a sampler and a learned collision checker as noted but removing them essentially reduces that algorithm to elast the state sampler is only used to bias the search and make it more efficient but also state sampling is likely freely available from your training data it is not clear to me why one wouldnt use it the collision checker is used to allow generalization to novel environments which in elast is handled by the context c it is not clear how powerfully the context can be used to extend to more complex environments and should be shown by this work one major change between 15 and this work is that the latent space is 
learned in a potentially more robust way through contrastive learning it would be useful to investigate the effect this has by comparing to vaebased latent spaces with enforced dynamics a few more minor weaknesses an algorithm box for the tree search and the latent space learning would be appreciated adding the network names to the figures in eg fig 3 and 4 would be helpful to quickly see how each fits together minor notes line 198 list the inptuts and outputs of hf elast rl is an interesting extension appendix d1 is interesting and would appreciate more detailed i would appreciate greater discussion of the limitations of elast right now its focused more on future work and how previous works can be incorporated a few limitations of at least the current setup i see is it is not clear how robust the approach is to more complex environments eg with more obstacles and higher dimensional problems in the true lowlevel state space or in the controls space it would also be good to understand how it scales to kinodynamic systems i feel the computation time may too be a limitation depending on the results docsepthis paper proposes a new algorithm for planning and control for longhorizon goalreaching tasks in highdimensional spaces eg control from vision to do so the proposed approach performs samplingbased planning in a lowerdimensional latent space this latent space and its associated transition model are learned using contrastive learning with these two building blocks the proposed approach recursively plans a sequence of latent states that are tracked using a learned control policy the approach outperforms existing baselines in a number of simulation experiments this paper makes an interesting contribution for longhorizon planning in highdimensional spaces as it combines longhorizon samplingbased planning techniques from the robotics community with learned representations from the machine learning community to enable planning in a lowerdimensional latent space the approach is original and clearly presented and in general the paper is wellwritten strengths the learned latent space and density transition model alleviate the need to learn a collision detector distance metric and sampling distribution for samplingbased planning by planning and learning a transition model in a latent space the approach does not require learning highdimensional transition models eg video generative models which can be computationally expensive to train and evaluate weaknesses the paper does not report computation times as such it is unclear if the approach runs in realtime and can be applied to realworld robotic problems or if it only works in simulation in particular more details should be provided regarding scalability of the method with respect to the dimension of the latent space computation times for the mpc controller etc yes the authors state the limitations of the approach but importantly the paper misses a discussion about the computation times a discussion of failure cases eg only 22 success rate on the hammer environment is included if the authors provide a convincing discussion about the computation times for their proposed approach i will be happy to raise my rating docsepthe authors propose elast a method for modelbased longhorizon planning in a lowdimensional latent space of partiallyobservable systems with highdimensional observations st eg images the method can be separated in three conceptual stages first latent encodings zt are obtained for each observation st see below second a transition model is 
fit to timeconsecutive encodings to capture the system dynamics pzt1 zt disregarding context c for brevity here finally the transition model is used for latent space planning towards a latent goal this is done by expanding a search tree and then using a heuristic to cull expanded nodes based on the transition models density such that the tree remains compliant with the learned dynamics ie samplingbased motion planning inspired by rapidlyexploring random trees rrt and expansive space trees est the lowdimensional latent space lowers the search cost which allows the authors to use long planning horizons the search produces waypoints for an mpc controller which uses a local parametric policy to bridge them the method is validated over 10 different simulated environments with decent complexity and reasonable success rates in terms of model learning the latent encodings are a byproduct of applying contrastive predictive coding cpc 1 to estimate the density ratio pstk st pst k which implicitly captures the dependence between consecutive observations st the objective of cpc is called infonce and is heavily inspired by noise contrastive estimation nce 2 and avoids generative modelling of the observations st alltogether the encodings alone are not enough for planning as a latent transition model is also necessary the authors estimate the transitions conditional using nce as well in a second step based on the latent encodings alone there is some redundancy in modelling the transition see my comments below references 1 oord avd li y and vinyals o 2018 representation learning with contrastive predictive coding arxiv preprint arxiv180703748 2 gutmann m and hyvrinen a 2010 march noisecontrastive estimation a new estimation principle for unnormalized statistical models in proceedings of the thirteenth international conference on artificial intelligence and statistics pp 297304 jmlr workshop and conference proceedings strengths the method seems widely applicable as little domain knowledge is employed planning appears to be efficient the authors consider long search horizons and up to 7500 tree expansion steps during planning using search for control be it with tree heuristics like here or through optimal dynamic programming is generally better than local optimization methods that are prone to getting stuck in local minima the formulation appears novel to me weaknesses the paper sets out to address planning for partiallyobservable systems specifically where the statespace does not match the observations which for me implies a partiallyobservable markov decision process pomdp however the methods are geared towards solving an mdp as lookahead control mpc is used and generative observation modelling is omitted alltogether i found no clarification of this aspect in the main text and i believe this is a very important point in terms of positioning the method and making its limitations clear for example an optimal pomdp controller would need to plan in belief space and account for the effect of future observations on the state estimate 1 state estimates should also generally be uncertain to account for observation noise and modelling errors and from what i can tell the obtained latent encodings are a deterministic mapping of the observations could you please comment on how the method fits differs from the pomdp paradigm i believe this should be discussed in the manuscript i can see how observation generation can burden model training but why is that an issue during deployment for control sec 1 paragraph 2 
assuming that we disregard the pomdp aspect which is currently the case control would only require the dynamics and rewards the goal state contrastive estimation to avoid observation generation has been considered in dreamer 2 before where through a single objective dynamics are learned in parallel with shaping the latent space im referring to the mutualinformation nce variant not the elbo what is the justification of learning the transition in two stages first the encodings through cpc then the dynamics through nce hf the forward dynamics model and psi the transition density seem redundant and this redundancy raises questions eg why is the transition density learned in a second stage see comment about dreamer above and why isnt the forward dynamics model trained to capture the transition density already it is argued that nce is better for transition estimation because the noise model provides outliers end of sec 42 but the noise model should be as close to the data distribution as possible for unbiased nce gradientscan you elaborate thresholding based on the transition density when rejecting samples from the search tree does not properly capture the probabilistic mass of the transition sec 43 have you considered rejection sampling importance sampling the experimental section ablates the planner and shows it is a core requirement for success but ablations of model learning encoding transition would also be helpful to assess the method eg what would happen if the planning scheme is used with dynamics learned through generative modelling eg planet or dreamer currently one cannot tell if the contrastive modelling is a major factor in terms of success or generative modelling could work just as well further remarks i believe the choice of the similarity used in cpc should be motivated further with more details about why it enforces proximity preservation this is not immediately clear from the ratio estimation of pstk st pstk alone i did not find details about the training of the ensemble hf this ties into my comment about the redundant transition modelling sequences of observations are generally not guaranteed to be markovian sec 4 first sentence despite the above weaknesses the method appears pragmatic and seems to work on the considered tasks i find that search in a lowdimensional latent statespace is generally a good idea albeit driven by heuristics this makes me lean towards a slightly positive assessment however i have reservations about the methodology and the authors input would be appreciated references 1 bertsekas d 2012 dynamic programming and optimal control volume i vol 1 athena scientific 2 hafner d lillicrap t ba j and norouzi m 2019 dream to control learning behaviors by latent imagination arxiv preprint arxiv191201603 i dont see any major negative social impact some of the methods limitations are acknowledged in a dedicated section docsepthis paper presents a method to perform planning in latent space for robotic applications the latent space is created for visual observations the planning procedure follows a methodology similar to samplingbased motion planning in robotics sampling new configurations here in latent space evaluation of the feasibility of transitioning reevaluating the connectivity of this new node to existing nodes the method is demonstrated in several tasks in simulation including nonrigid object manipulation a domain outside of what classical samplingbased planners could solve significance the method has the potential to open new avenues of research in planning 
it tries to extend a wellstudied solution from robotics samplingbased planning into domains where they couldnt be applied and also reducing the necessary information by integrating it with learning based methods quality the paper is well written and easy to read minor starting the paper with over the past decade machine learning has significantly improved the stateoftheart in imagebased robotics and perception but then citing a single paper feels a bit limited consider citing several seminal works or a survey also maybe there is a better citation than minsky for the limitations of rl credit assignment in sparse reward setup finally maybe unexplored is better word than undiscovered in some places of the text the planner explores these areas to check for feasibility also it should be clear what undiscoveredunexplored means in this context are these areas where the latents are generated before observing images extrapolation one notable lack are theoretical warranties samplingbased motion planning is a field where these warranties were carefully studied i wonder about the effect on completeness of sampling using a sampler based on an ensemble of observed dynamics this seems to steer search towards observed transitions which indicates that all paths need to be randomly explored at the beginning experiments on this would be necessary originality the main idea trajectory planning through sampling in the latent space has been presented before as the authors review however the presented method is original as it does not require a collision checker and can be used for environments with changes that can be parameterized clarity the paper and the concepts are mostly clear the section about nce could be improved for example the equations are copied from the original nce paper but the terms are not explained some sentences are a bit convoluted with multiple unclear subordinates eg l151 the use of random trajectories to train is first indirectly introduced in l191 this methodology for training should be explained before maybe at the beginning of section 4 in general the use of demonstrations is not clear consider indicating with bold text the best solution in table 1 there is a couple of typos eg l117 l207 l220 caption fig4 i indicated some of my doubts about the limitations in the questions i think some of those could be added to the currently very brief discussion of limitations in the paper ### Summary:
this work proposes a method that first learns a lower dimensional embedding via contrastive learning then learns a transition model and then utilizes a planner inspired by the samplingbased motion planning literature to plan in this latent space from start to goal states a modelpredictive controller is harnessed to follow the desired trajectories in latent space overall the work is well presented and has been well received by reviewers it should inspire techniques in a lot of related areas like visionlanguage navigation during the authorreviewer discussion period there was rich interaction and various additional clarifications and experiments were added by the authors the authors are encouraged to incorporate them into the cameraready version and release reproducible sourcecode to accompany the paper
input_ids: [352, 760, 2987, 275, 9864, 275, 1798, 625, ...]  (token IDs for this row; full list elided)
attention_mask: [1, 1, 1, ...]  (all ones, one entry per token)
labels: [352, 760, 2987, ...]  (identical to input_ids)
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the proposed method dnd learns an augmentation policy to augment a target text dataset the augmentation policy is optimized to create difficult yet not too different examples from the original data to further improve the performance a sample reweighting scheme is introduced to focus on harder examples experiments show that learned augmentation policy can achieve good results on various nlp tasks and transfer well across datasets and architectures strengths clarity the writing is clear and easy to follow wellmotivated approach with extensive experiments although the approaches to learn an augmentation policy using an adversarial objective enforce the augmented samples to be similar to the original samples and use continuous relaxation to train the augmentation policy are studied separately in image domain the proposed method is a viable way to apply the ideas to text data and shows good experimental results on various nlp tasks weaknessescomments comparison with other learning objectives the ultimate goal of data augmentation is to improve the generalization power of a model how does the proposed difficult and not different objective compare with the objective that improves the validation performance directly like autoaugment or taa 1 sensitivity of dnd hyperparameters despite being an automated data augmentation method there are several important hyperparameters eg lambdar lambdasim lambdas tp to be tuned how sensitive is the proposed method to these hyperparameters whether these hyperparameters needed to be tuned carefully ablation study for extra loss terms in the ablation study section are the vanilla random and fixed baselines also being trained with the extra loss terms lsim and lrecon while it is a motivated decision to introduce these terms during the training of ftheta can these losses also contribute to the learning of better representations and lead to the improvement it would be useful if the effects of these extra losses are discussed in the ablation study exclusion of mixup from the augmentation pool authors mention that mixup is not included in the augmentation pool as it alters the semantics of the original sentences however mixup can create difficult examples by sample and label mixing mixup also shows good results for some datasets under table 1 as a learningbased data augmentation method is it the responsibility for the search algorithm to learn the use of mixup in a datadriven way can the inclusion of mixup in the augmentation pool improves the performance if the validity of a candidate augmentation method has to be evaluated before adding to the augmentation pool does it contradict with the goal of fully learnable data augmentation hyperparameter tuning for other baselines some of the baseline methods like eda bertaug mixup backadv also involve hyperparameters according to appendix a2 it seems that these hyperparameters are not tuned as the hyperparameters of dnd eg lambdas and tp are tuned for each dataset it is fairer to tune the hyperparameters for the other baselines formulation of the probability and magnitude for eda is there a specific reason that eda is assigned with a single probability and magnitude parameter can the use of different p and mu values for synonym replacement random insertion random swap and random deletion further improves the augmentation policy definition of the magnitude parameters for the augmentation candidates it is unclear how 
the magnitude parameters are defined for the augmentation candidates for example is the mask probability of bertaug taken as the magnitude what are the ranges of the magnitudes it would be useful to include these in the appendix reference 1 shuhuai ren jinchao zhang lei li xu sun jie zhou text autoaugment learning compositional augmentation policy for text classification arxiv preprint arxiv210900523 2021 although the approach to learn an augmentation policy using the adversarial and similarity objectives is studied in the image domain the proposed method contributes to the adaptation of learnable augmentation in the nlp domain and shows good experimental results overall i tend to recommend for acceptance for the initial rating docsepthe authors proposed a new reward function to learn the distribution of augmentation policies for nlp tasks which can generate difficult and semantic similar samples to facilitate training they further introduced a samplewise reweighting scheme to leverage the learning status of original sample a continuous relaxation is applied to optimize the trainable parameters in policy the proposed method outperformed sota on text classification and entailment tasks they also conducted extensive studies and analysis demonstrating the effectiveness of their method on lowresource and classimbalanced regimes as well as its transferability strengths the proposed method is intuitive simple to apply and effective over many baseline augmentation methods on various datasets especially on lowresource classimbalanced regimes besides the good performance on benchmarks the authors also conducted good ablation study and analysis on learned policies showing that the selected policy coincides with previous work via exhaustive grid search weakness and concerns 1 the main issue lies in the choice of baseline as a learningbased augmentation method it will be more convincing to compare with other learningbased method using same pool eg autoaugment adversarial autoaugment etc if the method can still outperform them it can better demonstrate the effectiveness of their reward design 2 limited technical novelty the whole idea is very straightforward and most of the techniques are borrowed from previous works this paper proposed a simple and effective reward function for learningbased augmentation selection in nlp extensive experiments and analysis demonstrated its effectiveness on text classification and entailment especially on lowresource and classimbalanced regimes docsepthis paper proposes a data augmentation technique difficult but not too different the authors claim that an augmented sample should be difficult in terms of loss and semantically similar to the original sample the authors propose two rewards functions to reward difficult samples and similar samples respectively a relaxation technique is further used such that the augmentation policy can be trained experiments are conducted on the glue development set and several other text classification tasks the proposed method improves upon existing ones overall the paper is wellwritten and easy to understand experiments are conducted on different datasets and training schemes the proposed method improves upon existing data augmentation techniques the proposed algorithm in this paper is a combination of known techniques ie all the data augmentation techniques are wellestablished and the relaxation method used to train the policy is also not new because the technical novelty of this paper is limited which is acceptable empirical 
evaluation needs to be substantially strengthened some designs are not wellmotivated 1 in the similarity loss eq 4 a two layer mlp is used to serve as a linear classifier could the authors explain why this is better than simpler choices such as cosinesimilarity or ell2 distance 2 continue on eq 4 what is the accuracy of gw after training it is possible that gw cannot get properly trained because of the limited number of finetuning epochs 3 the reconstruction loss eq 5 is confusing could the authors provide more intuition on this also there should be experiments that demonstrate the necessity of this loss term the training looks adhoc there are several designs that seem arbitrary some additional experiments are needed 1 several existing methods are selected as the data augmentation pool cutoff backtrans bertaug adv eda and r3f there should be experiments that show the importance of these policies ie are they all necessary in figure 3a the probability of selecting r3f adv eda and cutoff is low cutoff nearly drops to 0 towards the end of training is it possible to only include backtrans and bertaug 2 the authors mention that they need to simultaneously train 4 augmentation policies what will happen if we only train one or more than 4 policy 3 the operation count is set to t2 in the experiments analysis are needed on other values of t 4 the authors mention an auxiliary operation hataya et al 2020 is added an ablation study is needed regarding the significance of this operation 5 in the sample weight w eq 1 there are two hyperparameters alpha and beta experiments are needed to see how these two hyperparameters change model performance 6 there should be experiments on the weight lambdar of the reconstruction loss algorithm 1 additional comments 1 in table 1 performance gain is not clear variance of the results is very large and it is hard to tell the significance of the gain the authors should conduct significance tests and report the pvalues 2 in the current version the training loss function is only shown in the algorithm box include it at the beginning of section 32 will make the presentation clearer i will raise my score if the authors can conduct the ablation experiments and address my concerns the paper applies existing differentiable data augmentation methods hataya et al 2020 to text classification overall the technical novelty is low there are several additional experiments in particular ablation studies needed to justify the design choices and the performance gain docsepthe paper provides a simple yet effective approach towards data augmentation in nlp tasks in particular it proposes an augmentation strategy that encourages constructing augmented samples with low confidence which makes them challenging hence more informative and high semantic similarity with the original samples which ensures the semantics are not lost in augmentation through comprehensive experiments the authors show that their approach outperforms recent stateoftheart techniques especially on lowdata and classimbalanced regimes strengths the paper is very well written the motivations and contributions of the paper are very clear and important data augmentation policy learning is a relatively underexplored area in the literature and i think this paper makes a very useful contribution to exploring this research area the proposed approach is simple enough to be replicated easily in followup research the provided code in the supplementary material is well written and easy to understand too the paper provides 
comprehensive experiments covering a diverse set of tasks it includes results using 8 baselinestateoftheart models on 6 datasets with different text classification tasks and on 8 tasks present in the glue benchmark it provides the results on multiple classimbalancelowdata setups further it provides an ablation study to understand the contribution of different components of the proposed model to provide evidence of the importance of each of those components finally the paper also provides some results to show the transferability of the approach and some analysis of learning dynamics in most of these results the proposed approach outperforms the baselinestateoftheart models weaknesses suggestions minor correction in page 4 rewarding not too different samples section its mentioned to see figure 1b and 6b but the figure 6 is not present in the main paper further both 1b and 6b are linked to 1b figure only gw from equation 3 is sent directly to the cross entropy loss function in equation 4 i think it would be more clear if a sigmoid function is added on gw either in equation 3 or 4 further in equation 3 did you consider simply calculating the dot product of m1 and m2 instead of concatenating them and using a neural network i think the equation 5 is a bit unclear could you clarify the following is the idea here similar to minimizing the dot product between the output embedding and input embedding for a given word can you add more description of what exactly v and x are x is referred to as the input token but its used as a label in the cross entropy loss implying that its a binary value how is the input token converted to a binary label given that the output and input embeddings of a transformer are in different spaces could you provide some justification of how calculating the dot product between those vectors would be appropriate a minor comment but the usage of policyreward made me think that this paper uses some reinforcement learning approach upon reading the paper the usage of those terms do seem appropriate i am not suggesting that this needs a change but i just wanted to point out that some readers might get confused a bit by this terminology regarding the glue benchmark results it would be interesting to see how this model fares in the glue leaderboard could you add those results if available in the final conclusion section could you add some directions in which this work can be extended overall i vote for accepting i think the paper proposes a very interesting approach to improve data augmentation in nlp tasks by learning a policy that decides how to combine different augmentation techniques in a task dependent manner to generate samples that are informative while retaining the semantic information from the original sample since this research area is relatively new i think this paper provides a really good baseline and a framework for other papers to improve upon as its easy to implement and allows adding more augmentation techniques in the policy pool improving the lossregularizer functions etc i have a few minor issues regarding the paper and i described them in detail below hopefully the authors can address my concerns in the rebuttal period ### Summary:
we appreciate the authors for addressing the comments raised by the reviewers during the discussion period which includes providing more experimental results to address the concerns we believe the publication of this paper can contribute to the important topic of data augmentation the authors are highly recommended to consider all the comments and suggestions made by the reviewers when further revising their paper for publication
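As a rough illustration of the objective described in the reviews above — rewarding augmented samples that are difficult for the current model yet still semantically close to the original, combined with a per-sample re-weighting term — the sketch below shows one plausible scoring function. It is an assumption-laden toy in NumPy, not the paper's actual method: per the reviews, the real system learns a similarity head (a two-layer MLP) and trainable policy parameters end to end, whereas here the cosine-similarity choice, the threshold, the penalty, and the exponential weighting form are all made up for illustration.

```python
# Illustrative sketch (not the paper's code) of a "difficult but not too
# different" score plus a sample re-weighting term, as discussed in the reviews.
# All names, thresholds, and functional forms are assumptions for illustration.
import numpy as np

def cosine_similarity(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8))

def cross_entropy(probs, label):
    return float(-np.log(probs[label] + 1e-12))

def dnd_reward(orig_emb, aug_emb, aug_probs, label, sim_threshold=0.8, penalty=10.0):
    """Reward an augmented sample that is hard for the current model (high loss)
    but still semantically close to the original (high embedding similarity)."""
    difficulty = cross_entropy(aug_probs, label)        # "difficult"
    similarity = cosine_similarity(orig_emb, aug_emb)   # "not too different"
    if similarity < sim_threshold:                      # too different -> penalize
        return difficulty - penalty * (sim_threshold - similarity)
    return difficulty

def sample_weight(orig_loss, alpha=1.0, beta=1.0):
    """Put more weight on original samples the model has not yet fit well
    (one plausible form of the re-weighting scheme the reviews mention)."""
    return alpha * (1.0 - np.exp(-beta * orig_loss))

# toy usage with random embeddings and a 3-class prediction
rng = np.random.default_rng(0)
orig_emb, aug_emb = rng.normal(size=16), rng.normal(size=16)
aug_probs = np.array([0.6, 0.3, 0.1])
r = dnd_reward(orig_emb, aug_emb, aug_probs, label=2)
w = sample_weight(orig_loss=1.2)
print(f"reward={r:.3f}, sample_weight={w:.3f}")
```

In the setting the reviews describe, such a reward would be fed back into a learned policy over the augmentation pool (back-translation, BERT-based replacement, cutoff, EDA, and so on) via a continuous relaxation, rather than used as a standalone score as in this toy.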
[input_ids: token ID sequence for this row omitted]
[attention_mask: sequence of 1s omitted]
[labels: token ID sequence for this row omitted]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors study an interesting problem applying an algebraic proof system to certify the correctness of monte carlo estimators they proposed a verifiable randomized approximation scheme based on an arithmetic circuit specifically they consider both the succinct arithmetic circuit and the approximated circuit to approximate random variables moreover they apply their analysis to two applications approximate model counting of disjunctive normal form and gradient estimation the authors propose a generic method to certify the correctness of monte carlo estimators with explicit examples the authors fairly state the potential limitations eg the lack of experiments though unfamiliar with this area i think this is a welcome contribution to the trustworthy machine learning community none docsep this paper proposes to verify monte carlo algorithms through the use of lowdegree arithmetic circuits the idea being that if a monte carlo algorithm can be made to agree with a computation as represented by this circuit we can be reasonably confident in the correctness of the computation after a relatively small number of problem instances originality while im not if arithmetic circuits have been used to confirm the correctness of monte carlo inference the idea of using multiple inference algorithms to check the correctness of each algorithm against the other is not new many of the examples used in the paper appear to be effectively doing just that it appears the polynomial looks like a tractable expectation technical quality while the paper has many examples of how these circuits could be constructed there isnt much that looks like a generic method there are no experiments or really any results showing empirically how much extra time is needed to produce a correctness certificate even for all the examples mentioned the work would greatly benefit from a more thorough exploration of some of the example circuits but also a more systematic way to produce these circuits clarity i found the paper challenging to follow there are many examples introduced and many extensions suggested for the approach but not much detail about the general algorithm itself significance i think there is much potential value in producing correctness proofs for monte carlo inference algorithms while its not fully clear how often an arithmetic circuit can be found for all expectations if that set of problems can be better clarified this could be a big deal the authors do discuss some of the limitations of this work possible negative societal impacts are not discussed but i suspect the risks are fairly minimal the research problem being addressed is very general docsepmonte carlo integration methods are ubiquitous in modern computational tasks and can be highly parallelized across multiple servers this paper introduces a framework based in algebraic circuits for verifying that the returned estimates provided by the servers were not adversarially manipulated if the estimates were properly computed the verification technique will not indicate the information has been manipulated if they were not properly computed the technique will indicate that manipulation has occurred with high probability the authors demonstrate their technique on several classical problems such as permanent estimation strengths while this paper is mostly outside of my area of knowledge it seems original and to have practical applications in distributed 
computing a strength of the paper is the focus on examples the examples considered are common practical problems and illustrate the usefulness of this technique another strength is that the introduction clearly illustrates the motivation for the problem and provides insight into the technique being used weaknesses one weakness is that the main theorem theorem 1 feels buried in the paper most of the paper focuses on example problems but it may be valuable to give a brief proof sketch of the main theorem the authors discuss the limitations of their work in the discussion section at the end of the paper in particular they note that it is nonobvious how to construct these verification systems for problems in which estimates do not have pairwise independence such as mcmc or sequential inference problems moreover they mention that engineering low degree circuits by hand is difficult and that learning circuit structure is hard even for restricted problem classes docsepthis paper introduces an mc integration technique in that the computations have an algebraic proof of correctness where the proof is polynomial evaluated at random points it provides a verified randomized approximation scheme for weighed counting to approximate the target quantity by use of mc strengths it is wellwritten and easy to follow proof of this technique seems to be very standard in the literature especially complexity theory but it is interesting to see the application in this context weakness using mc requires samples to be as uncorrelated as possible to converge this is often hard to accomplish in practice hence mcmc there are similarities between this work and some previous work specifically 27 some discussion on the effect of size and degree of circuits seems to be necessary seems like finding the appropriate circuit that approximates the random variable of interest is not an easy task samples must be iid which is hard to draw in practice lack of comparison to other techniques seems choosing circuits can be difficult and begs some more discussion on that even though some nice examples are provided it feels like it is not something widely applicable specifically for stateoftheart methods ### Summary:
reviewers collectively had relatively low confidence when reviewing this paper this is not surprising as the paper reads somewhat more like a theory cs paper rather than a typical ml paper even a theory one the main concerns seemed to be that the work is not super concrete and fairly speculative these seem like natural concerns for a tcsstyle paper which the authors have tried to address with many explicit examples and it seems tough to further address this without completely changing the focus of the paper i cautiously recommend acceptance with the hope that there will be further exploration of these ideas perhaps with more practical instantiation
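To make the setting of this row concrete, one of the two applications the review cites — approximate model counting for DNF formulas — has a classical Monte Carlo estimator (Karp–Luby) whose outputs are exactly the kind of estimates the proposed proof system would certify. The sketch below implements only that baseline estimator, not the paper's arithmetic-circuit verification scheme; the function name and parameters are illustrative assumptions, and the algorithm itself is standard textbook material.

```python
# Hedged sketch: Karp-Luby Monte Carlo estimator for the number of satisfying
# assignments of a DNF formula. This is the estimator being certified, not the
# verification circuit described in the reviewed paper.
import random

def karp_luby_dnf_count(clauses, n_vars, n_samples=20000, seed=0):
    """clauses: list of clauses, each a dict {var_index: required_bool}.
    Returns an estimate of the number of assignments satisfying the DNF."""
    rng = random.Random(seed)
    # Each clause alone is satisfied by 2^(n - |clause|) assignments.
    weights = [2 ** (n_vars - len(c)) for c in clauses]
    total = sum(weights)
    hits = 0
    for _ in range(n_samples):
        # Pick a clause with probability proportional to its weight ...
        i = rng.choices(range(len(clauses)), weights=weights)[0]
        # ... and a uniform random assignment satisfying that clause.
        a = [rng.random() < 0.5 for _ in range(n_vars)]
        for v, val in clauses[i].items():
            a[v] = val
        # Count the sample only if i is the first clause that a satisfies,
        # so each satisfying assignment is counted exactly once in the union.
        first = next(j for j, c in enumerate(clauses)
                     if all(a[v] == val for v, val in c.items()))
        hits += (first == i)
    return total * hits / n_samples

# toy usage: (x0 AND x1) OR (NOT x0 AND x2) over 3 variables -> 4 satisfying assignments
clauses = [{0: True, 1: True}, {0: False, 2: True}]
print(round(karp_luby_dnf_count(clauses, n_vars=3), 2))
```

A verifier in the reviewed paper's framework would additionally require the untrusted workers to return algebraic evidence (a low-degree polynomial identity checked at random points) that estimates like the one above were computed honestly; the plain estimator here provides no such certificate.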
[input_ids: token ID sequence for this row omitted]
[attention_mask: sequence of 1s omitted]
[labels: token ID sequence for this row omitted (truncated)]
5939, 12672, 247, 4757, 273, 253, 2929, 310, 253, 2770, 327, 6667, 253, 6667, 2783, 403, 1846, 8542, 3237, 285, 17093, 253, 31471, 273, 436, 5853, 1529, 4757, 310, 326, 253, 10199, 4518, 18303, 253, 16038, 323, 253, 1895, 285, 3400, 12288, 715, 253, 5853, 1146, 908, 50275, 20881, 1255, 265, 581, 14855, 310, 326, 253, 2022, 10012, 10012, 337, 9193, 14205, 275, 253, 2929, 954, 273, 253, 2929, 16633, 327, 1650, 3237, 533, 352, 778, 320, 9865, 281, 1918, 247, 4864, 4737, 23211, 273, 253, 2022, 10012, 50276, 783, 4477, 2319, 253, 7364, 273, 616, 789, 275, 253, 5955, 2593, 387, 253, 990, 273, 253, 2929, 275, 1798, 597, 3877, 326, 352, 310, 1327, 706, 3391, 849, 281, 3989, 841, 21999, 2718, 323, 3237, 275, 534, 8197, 513, 417, 452, 28208, 14275, 824, 347, 278, 3591, 68, 390, 22453, 17032, 3237, 25761, 597, 3748, 326, 11369, 1698, 4248, 14174, 407, 1133, 310, 2834, 285, 326, 4715, 5049, 2605, 310, 1892, 1014, 323, 11096, 1895, 5971, 5474, 33032, 2520, 2929, 23970, 271, 278, 68, 9554, 5853, 275, 326, 253, 30745, 452, 271, 20157, 4737, 273, 36594, 835, 253, 4737, 310, 14189, 6760, 387, 3632, 2792, 352, 3400, 247, 16058, 14871, 11193, 6974, 323, 24398, 15496, 281, 16851, 253, 2303, 10671, 407, 897, 273, 278, 68, 50276, 296, 3755, 20556, 50276, 262, 310, 973, 15720, 285, 3477, 281, 956, 50275, 16314, 273, 436, 5853, 3133, 281, 320, 1077, 2629, 275, 253, 6239, 3340, 10454, 3762, 533, 352, 310, 4722, 281, 923, 253, 2898, 275, 436, 3634, 50275, 20881, 1255, 50275, 5302, 278, 68, 4419, 3530, 281, 320, 347, 41656, 4919, 347, 1896, 281, 29623, 436, 310, 2223, 1892, 281, 14294, 275, 3946, 7613, 278, 3591, 68, 50275, 9088, 403, 22620, 875, 436, 789, 285, 690, 2045, 789, 5742, 3435, 50275, 8826, 5955, 327, 253, 1055, 273, 1979, 285, 4248, 273, 14174, 3133, 281, 320, 3309, 50276, 339, 3030, 751, 4560, 253, 4569, 5049, 326, 4020, 684, 253, 3632, 4778, 273, 1600, 310, 417, 271, 3477, 4836, 50274, 33380, 1364, 320, 891, 301, 534, 310, 1892, 281, 3812, 275, 3946, 50275, 77, 471, 273, 5301, 281, 643, 5609, 50276, 339, 3030, 13887, 14174, 476, 320, 2834, 285, 2353, 84, 690, 625, 5955, 327, 326, 50275, 9154, 2167, 690, 5322, 6667, 403, 2530, 352, 9193, 751, 352, 310, 417, 1633, 7561, 7763, 5742, 323, 1375, 23037, 14387, 3082, 50276, 187, 187, 4118, 18435, 27, 15337, 398, 26708, 574, 4942, 1698, 7162, 672, 16725, 436, 2929, 436, 310, 417, 10084, 347, 253, 2929, 9563, 8489, 625, 751, 247, 3762, 29180, 2929, 2581, 685, 247, 6867, 13361, 2929, 1014, 247, 3762, 581, 253, 2022, 7350, 4455, 281, 320, 326, 253, 789, 310, 417, 2221, 11859, 285, 9648, 35377, 841, 1646, 751, 3626, 7350, 323, 247, 246, 6113, 4826, 2929, 534, 253, 4477, 452, 3597, 281, 2953, 342, 1142, 6843, 6667, 285, 352, 3133, 10458, 281, 2007, 2953, 436, 1293, 4336, 6890, 253, 2770, 273, 253, 2929, 891, 45254, 5583, 14924, 342, 253, 3524, 326, 627, 588, 320, 2007, 17947, 273, 841, 5697, 4931, 342, 625, 8542, 8164, 2492 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper proposes a method for unsupervised metalearning based on using a variational autoencoder vae the variational autoencoder model they use differs from the typical one in that it considers episodespecific datasets where the approximate posterior can be computed as a function of the set using transformer architecture rather than using an individual example additionally they use a mixture of gaussian distribution as a prior whose parameters are learned perepisode using the em algorithm for the supervised evaluation phase in order to adapt the learned prior to the fewshot dataset setting semisupervised em is run using both support and query sets to adapt the mixture of gaussian distribution to the evaluation dataset then the query set predictions are obtained using the learned prior and posterior from the vae model experimental evaluation is conducted on the omniglot and miniimagenet benchmarks and the proposed method is compared against other unsupervised metalearning methods mainly cactus and umtra an interesting aspect about the miniimagenet experiments are that because learning the vae directly for this highdimensional data may be difficult the authors use features from a simclrtrained model as input for their vae model the proposed method seems to perform favorably across both of the benchmarks when varying the number of shots pros whereas previous work in unsupervised metalearning involved creating unsupervised episodes for metatraining via augmentations or clustering of unsupervised model features this paper takes a very different route but still seems to achieve very good performance the authors were able to scale their model to the miniimagenet dataset by using simclrtrained features and with this choice the final model attains good performance on the benchmark compared to previous work cons this is a not necessarily a big con but a point that could be clarified further how are the number of components for the gmm decided for metatraining how does the choice of the number of components impact how the gmm is used at evaluationtime would it not pose an issue that during training we may have moreless components than are actually necessary at evaluationtime depending on the number of classes we are considering at evaluationtime is it the case that a separate model needs to be trained if number of evaluation classes is changed ie from 1shot 5class to 1shot 10class i believe the paper could be improved by adding an algorithm description of how exactly the model is trained and how evaluation takes place in terms of exact steps the algorithmic pseudocode can reference equations within the paper but i believe this would greatly help in terms of understanding how to recreate the exact training and evaluation procedure for the proposed modeldocsepthe submission proposes an algorithm for the semisupervised metalearning unsupervised metatraining supervised metatesting setting of 1 which adapts the fewshot learning evaluation setting of 2 3 by omitting classification labels at metatraining time the algorithm makes use of a variational autoencoder vae formulation defined over a hierarchical model that describes the decomposition of a dataset into tasks of datapointtarget pairs ie the metalearning setup the prior distribution of the hierarchical vae is taken to be a mixture of gaussians to facilitate the construction of pseudolabels at metatraining time the algorithm is evaluated 
on the omniglot and miniimagenet fewshot classification tasks with labels unused at metatraining time strengths the semisupervised metalearning unsupervised metatraining supervised metatesting setting is interesting and worthy of study as an analogue of unsupervised learning the datasets used in the empirical evaluation are appropriate although they do not represent the most complex image datasets used in fewshot classification evaluations cf metadataset 8 the use of a gaussian mixture model gmm for the prior distribution of a variational autoencoder vae which allows an analytic solution for a subset of the variational parameters is conceptually interesting although its use would not be restricted to the metalearning setting using it to construct pseudolabels as well as incorporate labels when available for the semisupervised metalearning setting is a nice development although not a significant advance from the use of kmeans in cactus metagmvae attaints higher performance than semisupervised fewshot classification setting comparison methods cactus umtra on the omniglot and miniimagenet benchmarks moreover it approaches the level of a supervised fewshot classification method maml on the omniglot benchmark although this supervised comparator does not represent stateoftheart performance on this benchmark weaknesses 1 clarity the algorithmic components of the submission were very difficult to get straight in the development of the algorithm in section 32 the submission does not adequately discuss why and how particular subcomponents are employed and various points about the different algorithmic components are made in the text without sufficient explanation or integration some examples are the vae formulation is introduced without precedent just above equation 1 it is also a bit of a red herring because it is not subsequently used as is in the algorithm at the difference of our model from original vae is that we utilize a setlevel variational posterior qphimathbfzj mathbfxj di for inferring isotropic gaussian distribution to encode characteristics of a given dataset di specifically we utilize selfattention mechanism vaswani et al 2017 on top of a convolutional neural network this is the first time an isotropic gaussian distribution is mentioned in the method selfattention is not explained further and there is no explanation of how the convolutional neural network cnn fits into the whole framework for example it is not clear from this section whether and if so how a cnn is used in addition to the simclr feature representation we set the prior distribution as a mixture of gaussians gmm where y is a discrete random variable indicating the component of a latent variable mathbfz y and mathbfz are not yet defined except by reference to the vae formulation in 1 but that was insufficiently explained as a part of the algorithm more specific details for reproducibility are not described in the text eg how the gmm parameters are initialized for em what some of the variables mathbfz mathbfx phi theta refer to in the implementation on top of this results would be extremely difficult to reproduce while component architectures and experimental setups are detailed in the appendix how everything fits together is not adequately described more broadly the submission would benefit significantly from an algorithm box to convey how all the components interact and which components act episodically at the task level vs at the level of the entire dataset 2 quality the experimental evaluation does not provide a measure 
of variance eg 95 confidence interval in table 1 which should be provided to ascertain the significance of the reported improvement the algorithm uses the simclr representation learning objective to pretrain the feature extractor while the comparison semisupervised metalearning approach use less performative methods as feature extractors cactus bigan acaidc umtra a simple 4layer cnn an ablation study that ablates the use of simclr with metagmvae is necessary to ascertain whether the improvement is due to using simclr vs using components attributable to metagmvae 3 originality highly relevant work on gmm priors for vaes is not cited in the submission 4 5 the submission also does not discuss variations on the vae that address the metalearning setting eg 6 7 which also demonstrate how the vae formulation in 1 derives from a hierarchical model cf the nonhierarchical model on which the original vae formulation is based minor points there are errors in reproducing the results from 1 in table 2 of the submission some percentages are incorrect these errors do not affect the ranking of comparisons references 1 hsu kyle sergey levine and chelsea finn unsupervised learning via metalearning in iclr 2019httpsarxivorgabs181002334 2 vinyals oriol charles blundell timothy lillicrap and daan wierstra matching networks for oneshot learning in advances in neural information processing systems pp 36303638 2016httppapersnipsccpaper6385matchingnetworksforoneshotlearning 3 ravi sachin and hugo larochelle optimization as a model for fewshot learning in iclr 2017httpsopenreviewnetpdfidrjy0kcll 4 dilokthanakul nat pedro am mediano marta garnelo matthew ch lee hugh salimbeni kai arulkumaran and murray shanahan deep unsupervised clustering with gaussian mixture variational autoencoders arxiv preprint arxiv161102648 2016httpsarxivorgabs161102648 5 jiang zhuxi yin zheng huachun tan bangsheng tang and hanning zhou variational deep embedding an unsupervised and generative approach to clustering in ijcai 2017httpsarxivorgabs161105148 6 hewitt luke b maxwell i nye andreea gane tommi jaakkola and joshua b tenenbaum the variational homoencoder learning to learn high capacity generative models from few examples in uai 2018httpsarxivorgabs180708919 7 garnelo marta jonathan schwarz dan rosenbaum fabio viola danilo j rezende s m eslami and yee whye teh neural processes in icml 2018httpsarxivorgabs180701622 8 triantafillou eleni tyler zhu vincent dumoulin pascal lamblin utku evci kelvin xu ross goroshin et al metadataset a dataset of datasets for learning to learn from few examples in icml 2020httpsarxivorgabs190303096docsepthe problem which the authors attempt to solve is unsupervised metalearning uml ie learning in an unsupervised way such a model of a dataset as to be able to perform metalearning here fewshot classification later i see their contribution as twofold 1 proposing a framework for solving uml consisting of sampling subsets di of a full dataset du training a generative model based on both datapoints xj themselves and the particular subset di and using it in a semisupervised fashion 2 implementing a model in this framework based on a vae here the latent variable z doesnt just compress information about a datapoint as in a classical vae but is also able to encode in an abstract way the position of this datapoint in the subset di ie taskspecific label to be able to capture this arguably richer than in classical vaes distribution authors use a gmm to model the variational distribution because mle of gmm is intractable 
authors have a two stage optimization process a finding a taskspecific ie encoding info about di classes parameter phi via em and b optimizing the elbo given phi as usual during metatesting the phi parameter is estimated in a similar way using the testtime samples xi trying to embed the new task into the learned manifold and then latent variable z is sampled conditionally based on xi and the expected value of the constructed distribution pphiyz estimated via monte carlo 1 while i am neither a vae expert nor enthusiast i consider the proposed model principled while the twostage optimization mechanism is not ideal as may make it harder to optimize compared to endtoend differentiable models learning a single distribution describing both elements we care about images and their placement within a dataset seem to match the problem better than previous pseudolabelsbased methods 2 i particularly like introduction of the general framework 1 which is not emphasized in the paper i believe that it should be possible not necessarily straightforwardly to extend the proposed model to other generative models to make it clear i wouldnt expect this extension from the paper under review what im proposing is basically yet another paper but the opening of this direction of research is a big plus 3 paper is easy to understand 4 the presented results while competitive compared to the previous uml sota are only presented on somewhat toyish problems omniglot miniimagenet while it is understandable that itll be hard to train a metagmvae on more complex datasets as its only harder than classic vaes which are already struggling with higherdimensional tasks presenting the results only on small datasets even if this is the current sota and other methods do it somewhat undermines the overall motivation to uml to be able to use vast amounts of unstructured data while building ml models 5 i am not able to comment on the novelty of the work i am barely aware of the contemporary vaeuml literature i will be willing to modify my score based on other reviewers opinions in that regard questionproposal in sec 32 authors write assuming that the modalities in prior distribution represent classconcepts of any datasets why would this be the case this seems intuitive i feel like there could be a nice theoretical argument why it would be the case i find the model principled and new it solves an important problem in a natural way improving over sota and opening the potential for followup research i weakly question the use of vaes which feels like it is limiting the method making hidim uml impossible but am aware that is more of a complaint against a wellestablished research domain than the contribution of this paper itself typos 1 abstract from unlabeled data which can capture shares the spirit of unsupervised learning in that they both seek 2 sec 1 effectiveness of our framework we run experiments on 3 sec 2 one of the main limitations of 4 sec 32 inferring isotropic gaussian distribution to encode docsepthe paper goal is to learn unsupervised feature representations that can be transferred between fewshot classification tasks the paper models the classconcepts with a mixture of gaussians prior and uses variational autoencoders to model the latent representations between the tasks and the samples the presentation is clear and straightforward the idea is to use a gmm and use an expectationmaximization em approach to learn the mixture to tackle the intractability of the variational posterior qphi zj xj mathcaldi the paper proposes to 
use a monte carlo approximation for the metatest the model is tuned using em in a semisupervised fashion the experiments show the superiority of the metagmvae and the compared methods on the omniglot and miniimagenet datasets nevertheless i find the idea simple yet compelling the idea of adding gmm to enhance the modeling capabilities is a well known fact and that has been explored before for instance some recent publications applying the gmm idea not that the final application and overall implementation may differ from metalearningsee my comment below dilokthanakul et al deep unsupervised clustering with gaussian mixture variational autoencoders httpsarxivorgabs161102648 zhao et al truncated gaussianmixture variational autoencoder httpsarxivorgabs190203717 guo et al variational autoencoder with optimizing gaussian mixture model priors 101109access20202977671 yang et al deep clustering by gaussian mixture variational autoencoders with graph embedding 101109iccv201900654 it seems from the presentation that the main difference is the application to the metalearning approach the authors should explain better what the contribution is and how it contrast to the existing literature of mixture models applied in variational modeling pros simple and effective idea use of well known methods with simple approximators good results on the presented experiments cons the contribution is not clear im on the fence of whether the usage of the gmm to a new task is enough to guarantee a publication overall rating im giving a 5 due to the lack of clarity in the contribution and added novelty however the presentation is good and the explanations are clear im updating my rating to accept the paper due to the comments and updates on the paper the proposed flexible usage of the gmm is novel from the existing literature the changes in the paper improved its clarity and the contribution is better presented in contrast to existing work ### Summary:
this paper addresses a method for unsupervised metalearning where a vae with gaussian mixture prior is used and setlevel inference taking an episodespecific dataset as input is performed to calculate its posterior in the metatesting phase semisupervised learning with the learned vae is used to fast adapt to fewshot learning reviewers are satisfied with the author responses agreeing that the method is a principled way to tackle unsupervised metalearning
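The reviews above repeatedly walk through the test-time adaptation step — semisupervised EM over support and query embeddings under a Gaussian mixture, with query predictions read off the adapted responsibilities — so a compact sketch may help make that step concrete. This is a minimal reconstruction from the reviewers' description only, not the Meta-GMVAE implementation: the isotropic covariances, the initialization from class-wise support means, and the clamping of support responsibilities to their labels are simplifying assumptions made for illustration.

```python
import numpy as np

def semisupervised_em(z_support, y_support, z_query, n_classes, n_iters=10, eps=1e-6):
    # Support embeddings keep hard one-hot responsibilities fixed to their labels;
    # query embeddings get soft responsibilities re-estimated at every E-step.
    z_all = np.concatenate([z_support, z_query], axis=0)            # (Ns+Nq, d)
    r_support = np.eye(n_classes)[y_support]                        # (Ns, K) one-hot
    r_query = np.full((len(z_query), n_classes), 1.0 / n_classes)   # (Nq, K) uniform init

    # Initialize each component from the labelled support points of that class.
    mu = np.stack([z_support[y_support == k].mean(axis=0) for k in range(n_classes)])
    sigma2 = np.ones(n_classes)                                     # per-component isotropic variance
    pi = np.full(n_classes, 1.0 / n_classes)                        # mixing weights

    d = z_all.shape[1]
    for _ in range(n_iters):
        # E-step (query points only): Gaussian log-densities up to an additive constant.
        dist2 = ((z_query[:, None, :] - mu[None, :, :]) ** 2).sum(-1)        # (Nq, K)
        log_p = np.log(pi + eps) - 0.5 * dist2 / (sigma2 + eps) - 0.5 * d * np.log(sigma2 + eps)
        log_p -= log_p.max(axis=1, keepdims=True)                            # numerical stability
        r_query = np.exp(log_p)
        r_query /= r_query.sum(axis=1, keepdims=True)

        # M-step over support + query responsibilities.
        r_all = np.concatenate([r_support, r_query], axis=0)                 # (Ns+Nq, K)
        nk = r_all.sum(axis=0) + eps
        mu = (r_all.T @ z_all) / nk[:, None]
        dist2_all = ((z_all[:, None, :] - mu[None, :, :]) ** 2).sum(-1)      # (Ns+Nq, K)
        sigma2 = (r_all * dist2_all).sum(axis=0) / (nk * d)
        pi = nk / nk.sum()

    return r_query.argmax(axis=1)   # predicted class index for each query embedding
```

Query predictions then come straight from the adapted responsibilities, which mirrors how the reviews describe obtaining query-set predictions from the prior learned at meta-training time.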
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper studies the problem of hyper parameter tuning in the setting of differentially private training of machine learning models the paper first shows that the hyper parameters used to train a model and the corresponding utility can leak information about training through experimenting with outliers they then go on to introduce and analyze ways that would help release dp models trained with the best set of hyper parameters with small privacy leakage if for hyper parameter tuning m models are trained with dp then the leakage for releasing the best model would be mepsilon with simple composition the method introduced by the paper however builds on prior work by liutalwar and improves this to become 2epsilon through randomly choosing and running hyper parameter settings the problem studied here is a really important one since it is directly related to actual deployment of dp models in real life settings the proposed method seems solid and the experimental results look promising i do have a few clarification questions 1 how would this method be extended to be used with adaptive search methods grid search can be too time consuminginefficient especially when used with dpsgd as dpsgd needs perexample gradients that are expensive to obtain and there are some reinforcement learning based methods that can yield optimal hyper parameters much faster than grid search can this method be modified maybe with a higher privacy expenditure to be used in those cases 2 what kind of reallife attack scenario would you envision that could actually use the best models obtained by hyper parameter tuning to extract information basically what typehow much information about a model do the set of optimal hyper parameters leak i dont see any major issues with the paper and i find the problem addressed very relevant docsepthis paper has made the following contributions firstly this paper illustrates how simply tuning hyperparameters based on nonprivate training runs can leak private information second this paper provides privacy guarantees for hyperparameter search procedures within the framework of renyi differential privacy their results improve and extend the work of liu and talwar stoc 2019 this paper considers an interesting and important problem how hyperparameter tuning on private dataset can leak information where it provides an intuitive svm example the paper then considers how to reduce the leakage the problem is formalized in the following way we pick a total number of runs k from some distribution then for each run k 1 2ldots k we pick an index jk in m uniformly at random and run mjk then at the end we return the best of the k outcomes the paper proposes theoretical guarantees when k comes from truncated negative binomial distribution or poission distribution which strictly generalizes the previous results furthermore it also proposes a new method of computing privacy leakage when k comes from a general distribution which should be of independent interest finally the paper conducts empirical experiments to show the improvement if the new method my only concern for this paper is the problem formalization first for the current problem formulation it is possible that the same parameter will be tried multiple times which is definitely a waste of privacy not sure whether people will do it in practice second the paper assumes the following scheme satisfies dp randomly choose j in m and run mj note 
that this is not equivalent with assuming each mj is dp where the latter is more realistic for example in the hyperparameter tuning of dpsgd it easily holds that each run with different clipping norm satisfies dp however it is not clear to me whether randomly selecting one clipping norm and then running dpsgd is differentially private the authors need to justify the relationship between these two assumptions naively speaking the second can not lead to the first assumption please correct me if i miss something generally speaking i recommend acceptance of this paper docsepthe paper provides an considerable improvement to the dp analysis of hyperparameter tuning of dp algorithms such as dpsgd the analysis is carried out using rnyi differential privacy rdp and the dp bounds are rdp bounds that contain the rdp parameters of the underlying mechanisms that give the private candidates most importantly the paper considerably improves the stateoftheart of liu and talwar 2019 also a nice counterexample using svms is constructed that shows the importance of this problem the paper is very well written and the analysis seems solid though i did not check every line the problem is important and the paper improves the stateoftheart so its merits are clear my only critique as emhpasised in sec 32 the strawman approach of fixing m does not work for pure epsdp however i find this a bit misleading since you are not observing eps0dp of hyperparameter tuning but epsdelta or rdp so the delta might actually play a crucial role here allowing bit of delta in the dp bound the randomness in choosing the number of repetition might perhaps be not that crucial or vice versa in eps0 you would not get great gains from that randomness i suspect that this fact that you have to have the randomness in the number of repetitions actually is a requirement of the rdp analysis that you carry out as far as i see in proposition 17 you claim it is not but i do not fully see why that would be the case the result of proposition 17 ie that the rdp of k repetitions is not rdp for less than epslambda does not really give a counterexample in the same way as that result for pure epsilon for example suppose the underlying mechanism is eps05 dp then you choose lambda 19483 then eps log1expepslambda1 1e4 in which case the bound does not really say anything even for quite large numbers of k and you can of course make that bound arbitrarily small by choosing lambda appropriately ie if you let delta 0 i think there is room for tighter analysis also for fixed k could you comment on this tradeoff between lambda and eps in the bound of proposition 17 could you give more intuition on why the randomness in number of repetitions k would be crucial or some other example that illustrates this other there is something strange at the bottom of p 22 equation going over the line all in all i think this is a very nice analysis and improves the stateoftheart also this is a very important problem with small modifications i think this ought to be accepted my only critique i am just not entirely convinced that randomness in the number of repetitions is crucial for having tight bounds for dp hyperparameter tuning i hope the authors can clarify my concerns docsepthe authors make the following contributions regarding differentially private hyperparameter tuning as an example the authors train an svm with a weight penalty and show that in presence of an outlier a membership inference attack can be employed to infer from the hyperparameter whether or not the outlier 
was a part of the training set the authors provide an algorithm for private hyperparameter tuning which consists of running a learning algorithm that satisfies rnyi differential privacy rdp for a random number of times k each with a hyperparameter drawn uniformly at random from a finite candidate set the authors first prove rdp guarantees of the algorithm when k is sampled from a truncated negative binomial distribution and poisson distribution then they proceed to prove rdp guarantees for any distribution of k supported on mathbbncup 0 the authors propose a way to measure the utility of the algorithms by looking at the expected quantile of the output the results of their utility analysis coupled with an experiment on mnist show that the algorithm with poisson distribution performs better than those proposed by liu and talwar 2019 in an intermediate range of privacy budget varepsilon from what i understand we can obtain a tighter generic bound by going through the proof of lemma 7 but the authors opt to use 7 since the postprocessing often leads to a bound that is independent of q and q when plugging in a specific distribution dpfocused machine learning systems would benefit from this work as most of them require hyperparameter tuning the algorithms provided in this work help us avoid a large cost of composition and perform tuning at much smaller privacy loss in addition the generic bound allows us to be flexible on the distribution of the number of runs k though it requires some effort to translate the logarithmic term into something usable i have checked all the proofs and there are no critical errors the authors have sufficiently compared their method with previous approaches specifically using the main results the authors show that their method extends and improves upon the work of liu and talwar 2019 the authors also compare their method with the exponential mechanismbased method gupta et al 2010 theorem 102 and show in section d4 that both methods have the same lower bounds of the utility guarantees up to constants nonetheless the authors might want to compare their method with the noise perturbation method proposed by chaudhuri et al 2013 this method requires the stability condition on the training procedure which might not be tractable for neural networks but for linear classifiers the algorithm algorithm 1 only adds noises of size o1nvarepsilon to the scores which is quite attractive for training on large datasets the paper is mostly written with dpsgd in mind but it does not consider when the models evaluation on a holdout validation set is incorporated in the base algorithm q which is a common practice in training an ml model from an application point of view the authors might want to discuss a bit how the models evaluation in classification or regression can be made dp or rdp as a part of q i see that distributions with finite support eg the truncated binomial are not considered in this study but in practice one might want to limit the number of hyperparameter searches eg due to computational constraints so such distributions might come into play have the authors performed some experiments on these against the truncated negative binomial and poisson distribution i wonder if the privacyutility tradeoff is better when restricting to finite support could the authors comment on how to determine an appropriate size of the candidate set m a simple heuristic is mmathbbek but the authors might have something better in mind specific comments page 2 in eq 2 the special case where lambda1 
should be mentioned here as it is also in the range of hatlambda in the main results page 6 when i tried to derive 6 from 5 i got an extra rhoeta term from rewriting epsilonrholambda11etaleft1frac1hatlambdarighthatepsilonrholambdarho1etahatlambda1rholambda1rhohatlambda1etarhoeta combining this with the rest of the terms and then choosing lambda and lambda so that the equality holds in the following inequality leftrholambda1 fraclogmathbbeklambda1 rightleftrhohatlambda1eta frac1etalog1gammahatlambdaright rhoeta geq 2sqrtrhologmathbbek 21etasqrtrholog1gammarhoeta to clarify this i think the authors should provide the proof of corollary 4 somewhere in the appendix as i find the bound to be nontrivial page 7 in lemma 7 q and q are arbitrary probabilities is misleading when i saw the statement for the first time i read this as q and q can be anything in 01 but the proof indicates that they take specific forms in order for the lemma to hold true one way to resolve this issue is by directly stating the definition of q and q after 7 line 12 in page 8 vaguely f being smooth corresponds to the distribution k being spread out ie far from being a point mass this is not true if x is a point mass at 1 ie prx11 then the pgf of x is fxx and so fx1 which is smoother than any polynomial looking at the definition of pgf the smoothness of f should depend on the righttail heaviness of the distribution of k specifically a heavier right tail corresponds to a faster growth rate of f which in turn leads to a larger rhs of eq 7 when q q this observation is in line with the privacyutility tradeoff more probabilities of sampling a large k ie more utility lead to a larger privacy loss page 18 when prinfty then the value of rp is unclear what is the convention for this case page 23 it is mentioned below the definition of qqaa that the total ordering prefers the first option corresponding to the first coordinate probability which should refer to the ones with the probabilities 1bc and 1bc in other words we assume that 1bcb and 1bcb however the approximations below suggest that bc1bc and bc1bc page 25 the proof of lemma 20 is missing does it appear in bun steinke 2016 minor corrections page 6 in remark 5 lambda2varepsilonrdp implies lambda2varepsilonrdp rightarrow lambda2varepsilonrdp implies lambda1varepsilonrdp and ann rightarrow any page 14 in the footnote experssion rightarrow expression page 16 in the definition of gy t should be t page 21 in the third display equation the equal sign should be replaced with page 23 first line in section d4 hyperparamter rightarrow hyperparameter references bun m steinke t 2016 concentrated differential privacy simplifications extensions and lower bounds tcc16 chaudhuri k vinterbo sa 2013 a stabilitybased validation procedure for differentially private machine learning nips13 gupta a ligett k mcsherry f roth a talwar k 2010 differentially private combinatorial optimization soda 10 liu j talwar k 2019 private selection from private candidates stoc19 this work provides careful privacy and utility analysis of private algorithms for hyperparameter tuning all of the analyses and the proofs are sound and the experiments give good comparisons between various distributions to make the methods widely applicable the authors might want to comment on how the models evaluation on a holdout validation set can be integrated into the dp workflow overall this is a strong paper and i recommend it for publication ### Summary:
this paper tackles a problem at the intersection of automl and trustworthiness that has not been studied much before and provides a first solution leaving much space for a lot of interesting future research all reviewers agree that this is a strong paper and clearly recommend acceptance i recommend acceptance as an oral since the paper opens the door for a lot of interesting followups
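The selection procedure described in the review above (draw the number of runs K from a distribution such as a Poisson, pick a candidate hyperparameter uniformly at random for each run, and release only the best outcome) can be sketched in a few lines. This is an illustrative sketch only, not the paper's implementation: `train_with_dp`, `score`, and the candidate list are hypothetical stand-ins, and the real privacy guarantee still comes from the RDP accounting the review discusses.

```python
import numpy as np

def private_hyperparameter_selection(candidates, train_with_dp, score, mean_runs=10, seed=None):
    """Run a DP training routine a random number of times and release only the best result.

    candidates   : finite list of hyperparameter settings (the set M in the review).
    train_with_dp: callable(hparams) -> model, assumed to satisfy RDP on its own (e.g. DP-SGD).
    score        : callable(model) -> float, higher is better; its own privacy cost must be
                   accounted for separately.
    mean_runs    : mean of the Poisson distribution the number of runs K is drawn from.
    """
    rng = np.random.default_rng(seed)
    k = rng.poisson(mean_runs)                  # K ~ Poisson(mu); K = 0 means nothing is released
    best_model, best_score = None, float("-inf")
    for _ in range(k):
        hparams = candidates[rng.integers(len(candidates))]   # uniform over M, repeats allowed
        model = train_with_dp(hparams)
        s = score(model)
        if s > best_score:
            best_model, best_score = model, s
    return best_model, best_score
```

As the review points out, the same candidate can be drawn more than once under this uniform-with-replacement scheme; that repetition is what the paper's analysis covers, and swapping in an adaptive search would require a different privacy argument.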
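The review also refers to the paper's motivating observation that non-privately tuned hyperparameters can leak membership: the winner of a grid search may change depending on whether a single outlier is in the training set. A rough, self-contained way to probe that effect with scikit-learn is shown below; the synthetic data, the injected outlier, and the grid are invented for illustration and are not the paper's construction.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

param_grid = {"C": [0.01, 0.1, 1.0, 10.0, 100.0]}

def best_c(features, targets):
    """Return the regularization constant chosen by a cross-validated grid search."""
    search = GridSearchCV(LinearSVC(max_iter=20000), param_grid, cv=5)
    search.fit(features, targets)
    return search.best_params_["C"]

# Same search with and without one extreme, mislabeled point.
X_out = np.vstack([X, [[8.0, 8.0]]])
y_out = np.append(y, 0)

print("best C without the outlier:", best_c(X, y))
print("best C with the outlier:   ", best_c(X_out, y_out))
```

Whether the selected C actually flips on a given draw is not guaranteed; the point is that the released hyperparameter depends on individual records, which is exactly the dependence a membership inference attack can exploit.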
[ input_ids: token-id array for this example; full list of integers omitted for readability ]
[ attention_mask: all-ones array of the same length; omitted ]
[ labels: token-id array identical to input_ids; omitted ]
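Rows in this dump pair the prompt/summary text with input_ids, attention_mask, and labels columns, the usual shape for causal language-model fine-tuning data. The tokenizer and truncation length actually used here are not stated, so the snippet below is only a generic sketch of how such columns are commonly produced; the GPT-2 tokenizer and the 2048-token cap are assumptions.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")   # assumed; the dataset's actual tokenizer is unspecified

def build_row(input_text, output_text, max_length=2048):
    """Tokenize one prompt/summary pair into the columns seen in this dump."""
    enc = tokenizer(input_text + " " + output_text, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],   # all 1s when nothing is padded
        "labels": list(enc["input_ids"]),          # targets mirror the inputs for causal LM training
    }
```

This matches the shape of the omitted arrays: an all-ones attention mask and a labels list identical to input_ids.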
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: the paper at hand shows relations from selfsupervised learning eg contrastive learning and metalearning eg finetuning whereas the shown relationship is interesting its not that novel and i would even argue that several teams have implemented similar ideas already however it has not made that explicit especially the gradient accumulation for contrastive learning which is quite relevant for many learning applications in the real world overall the paper is quite well written and easy to understand and follow as said above novelty might be a bit limited but i can see the benefit of highlighting the relationship of meta and contrastive learning for many readers in the community experimental results are in favor of the approach however a more comprehensive analysis eg confidence intervals would have been appreciated a good summary of the stateoftheart and highlighting some interesting relationships docsepthis paper formalizes a connection between the training procedures of fewshot metalearning and contrastive learning it shows that metalearning algorithms can be used to pretrain on image data and outperform standard contrastive learning on downstream tasks the authors further use the connection to develop data augmentation procedures for contrastive learning strengths 1 mathematical formalization of the connection between contrastive learning and metalearning 2 new data augmentation techniques for contrastive learning lead to improvements on standard evaluations 3 paper is clear and easytoread 4 code is provided in the supplement weaknesses 1 the main weakness with the paper is the lack of comparison to the large body of work comparing unsupervised learning and metalearning most notably hsu et al 2019 showed that metalearning algorithms can be used for unsupervised learning and many of the papers that cite it have explored transferring techniques in both directions the novelty here seems to be just the connection to contrastive learning specifically 2 it is unclear if any of the methods proposed here would extend to domains such as text which is not examined 3 minor the paper title is aggressive suggesting that we can stop doing contrastive learning and focus on metalearning even though the only connection is in the data generation during training not in the goals of the two learning paradigms comments 1 the training loop for metalearners typically involves i sampling a random batch of classes and ii updating a feature extractor to distinguish between these classes this describes the fewshot learning subset of metalearning metalearning itself is much broader including applications to supervised learning and rl that do not involve sampling classes 2 in the inner loop the model is first finetuned on support data tsi then in the outer loop the updated model is used to predict on query data tqi and a loss is minimized with respect to the models parameters before finetuning algorithms such as reptile do not separate task data into support and query data 3 these methods usually require operations in pixel space which is computationally expensive what does this mean dont most deep nets apply at least one operation to pixel space references hsu finn levine unsupervised learning via metalearning iclr 2019 the main concern with this work is that the main insight seems to be a somewhat more specific variant of observations made in past work on the connection between modern unsupervised and
metalearning while there are some interesting experimental results i lean against accepting given the lack of any analysis concerning what the novelty is here compared to those papers docsepthe paper first shows that the current popular contrastive learning for selfsupervised visual representation learning shares some similarity with a metalearning framework for fewshot learning this inspires the authors to propose a metalearning framework for selfsupervised learning in addition the paper also shows that tools data augmentation and gradient accumulation developed in metalearning can help enhance contrastive learners experimental results show that the proposed metalearning framework for selfsupervised learning outperforms simclr a stateoftheart contrastive learning method on multiple downstream tasks in a semisupervised learning framework though it performs not as well as simclr under the linear evaluation protocol on imagenet other results show tools developed in metalearning can help enhance contrastive learners under the linear evaluation protocol strengths 1 learning selfsupervised visual representations via metalearning is new and interesting 2 the presented metalearning framework for selfsupervised learning improves over the simclr in downstream tasks this shows the potential of using metalearning for selfsupervised visual representation learning 3 the tools from metalearning are proved to benefit simclr weaknesses 1 a core claim and also the title of the paper is contrastive learning is just metalearning contrastive learning can be interpreted as a special case of metalearning with a certain task distribution however its main arguments are not convincing to support this statement in particular the paper only discusses similar features and also differences shared by contrastive learning and metalearning such as solving new tasks onthefly with each batch and learning invariances which generalize to novel problems at inference these shared features are not unique or characteristics to either metalearning or contrastive learning for example deep metric learning via a triplet loss also solves new tasks different anchors positivenegative examples in each batch and aims to learn features generalize to new problems eg new faces for face recognition to claim contrastive learning is just metalearning one should present a general metalearning framework and show how the contrastive learning framework can fit in that framework as a special case also one should think about the core characteristics of these two frameworks and show one is a subset of the other for example i did not see anything meta in simclr otherwise only showing some shared features it is similar to proving that elephants are just mice because they both have four legs a head and a tail 2 the evaluation is inconsistent in different parts of the paper section 4 presents results from both linear protocol and downstream tasks while section 5 and 6 only present the former some experiments are only on cifar10 while some are only on imagenet it is unclear whether the data augmentation and gradient accumulation help metalearning or simclr in downstream tasks 3 the idea of large rotation as auxiliary loss reads similar to prior work on classifying random image rotations as a pretext task for selfsupervised learning what is the main difference 4 the proposed metalearning framework for selfsupervised learning contains iterations in the inner loop prior selfsupervised work indicates the number of training iterationsepochs can significantly 
affect the performance an experiment is needed to the performance of metalearning and simclr under different epochsiterations it is also interesting to see the extra training time added by the inner loop 5 is the loss l on page 5 a contrastive loss learning selfsupervised visual representations via metalearning is new and interesting experimental results also indicate its potential effectiveness but the main claim contrastive learning is just metalearning is not well supported as detailed in main review the choice of datasets and tasks in different experiments is heuristic making the results less convincing experiments on the impact of iterations within the inner loop are needed to show more indepth comparison between the metalearning framework and simclr docsepthis paper proposed a framework to integrate contrastiveselfsupervised learning into metalearning literature the authors demonstrate that contrastive learning principles implemented in metalearning methods such as r2d2 can achieve comparable results on various computer vision tasks next the authors proposed two tricks in metalearning literature to improve ssl models including 1 rotation prediction 2 batch gradient accumulation both methods demonstrate incremental improvement over the standard simclr baseline 1 the idea of integrating selfsupervised learning into metalearning is novel the insight of treating different augmentation as task augmentation is conceptually interesting even though the paper does not extend such a framework to the level of other metalearning literature that datasets and tasks are largely different i believe the concept proposed in the paper may be a good inspiration for future selfsupervised learning research 2 the first empirical idea proposed in the paper is exactly just rotnet arxiv180307728 and combining it with the jointembedding approach is also not novel selfsupervised representation learning by rotation feature decoupling cvpr19 therefore it has limited novelty i do not see how this setting is different when fitting into the metalearning framework its still working as an auxiliary loss 3 the second empirical idea seems new to the selfsupervised learning framework even though the idea itself comes from metalearning is not novel i think that demonstrating such an idea can work in a selfsupervised learning framework is interesting and can probably benefit the general selfsupervised learning research community 4 the authors provided all experimental details for reproducing the results i understand that its not possible to reproduce standard simclr due to computational budget but it seems that the comparison is not satisfactory i do not see any hyperparameter search for the baseline simclrbyol model these hyperparameters may not be optimal for example using a large learning rate of 4 and a high temperature of 05 may not be best for training small batch sizes such as 256 also all experiments improvements are incremental eg table 6 7 8 only demonstrates 1 without an exhaustive hyperparameter search this 1 is not convincing enough 5 overall the paper is well written the concept of combining selfsupervised learning and metalearning is interesting and may have a bigger impact in the future the first trick of rotation prediction has limited novelty but the second trick is interesting and demonstrated to be useful empirical results are all incremental and not convincing enough ### Summary:
this paper was borderline based on the reviews the paper points out an interesting connection somewhat known but not in this specific version and good experimental results however numerous reviewers raised concerns that the paper was lacking a comparison to prior work connecting unsupervised learning and metalearning most notably hsu et al 2019 after reading the revised version of the paper the authors address this issue and also all the other reviewer comments in relation to prior work they clarify that they focus on the contrastive unsupervised case and also do a good job in answering other reviewer concerns relative to novelty and results i would also like to point out as reviewers also did that the previous title was a bit aggressive and provocative gladly the authors agree to change it to a more scientific the close relationship between contrastive learning and metalearning overall i think the authors have done a good effort on addressing the reviewer concerns and i think the paper would be interesting for iclr readers
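Several comments above hinge on the episodic training loop: fine-tune on support data in an inner loop, then update the shared parameters from the query loss in an outer loop, with each batch of augmented views read as a task. The sketch below is a generic first-order version of that loop, not the paper's R2D2 implementation; the encoder, head, and episode tensors are placeholders.

```python
import copy
import torch
import torch.nn.functional as F

def episode_step(encoder, head, support, query, outer_opt, inner_lr=0.01, inner_steps=5):
    """One meta-training episode: adapt a copy of the head on the support set,
    then use its query loss to update the shared encoder (first-order shortcut)."""
    xs, ys = support
    xq, yq = query

    # Inner loop: fine-tune a task-specific copy of the head on detached features.
    adapted = copy.deepcopy(head)
    inner_opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
    feats_s = encoder(xs).detach()
    for _ in range(inner_steps):
        inner_loss = F.cross_entropy(adapted(feats_s), ys)
        inner_opt.zero_grad()
        inner_loss.backward()
        inner_opt.step()

    # Outer loop: the adapted head's query loss updates the shared encoder.
    outer_loss = F.cross_entropy(adapted(encoder(xq)), yq)
    outer_opt.zero_grad()
    outer_loss.backward()
    outer_opt.step()
    return outer_loss.item()
```

In the contrastive reading, the support and query tensors would hold differently augmented views of the same images, with views of the same original sharing a pseudo-label.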
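The rotation-prediction idea the last review relates to RotNet amounts to rotating each image by a multiple of 90 degrees and training a small head to classify which rotation was applied, added to the main objective as an auxiliary term. A minimal sketch, with the encoder and rotation head as placeholders rather than the paper's exact design:

```python
import torch
import torch.nn.functional as F

def rotation_auxiliary_loss(encoder, rot_head, images):
    """Classify which of the four 90-degree rotations was applied to each image (NCHW input)."""
    rotated, targets = [], []
    for k in range(4):                                   # 0, 90, 180, 270 degrees
        rotated.append(torch.rot90(images, k, dims=(2, 3)))
        targets.append(torch.full((images.size(0),), k, dtype=torch.long, device=images.device))
    x = torch.cat(rotated)
    y = torch.cat(targets)
    return F.cross_entropy(rot_head(encoder(x)), y)
```

It would typically enter training as total_loss = main_loss + weight * rotation_auxiliary_loss(encoder, rot_head, images), with the weight treated as one more hyperparameter.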
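The gradient-accumulation trick mentioned in both reviews emulates a large batch by summing scaled gradients over several sub-batches before taking a single optimizer step. The sketch below shows only the basic mechanics, with the model, loss, and augmentation passed in as placeholders; note that for a batch-dependent loss such as InfoNCE this is not exactly equivalent to one large batch, since negatives still come only from within each sub-batch, which is why the paper's treatment is more involved than this.

```python
def accumulate_and_step(model, optimizer, sub_batches, contrastive_loss, augment):
    """Take one optimizer step whose gradient is averaged over several sub-batches."""
    optimizer.zero_grad()
    n = len(sub_batches)
    for batch in sub_batches:
        za, zb = model(augment(batch)), model(augment(batch))   # two augmented views per image
        loss = contrastive_loss(za, zb) / n                     # scale so the summed grads form a mean
        loss.backward()                                          # gradients accumulate across sub-batches
    optimizer.step()
```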
[ input_ids: token-id array for this example; full list of integers omitted for readability ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 387, 1133, 2722, 2493, 432, 1881, 35421, 28269, 24088, 4499, 422, 4715, 285, 5148, 613, 920, 24088, 1442, 292, 25004, 5727, 253, 2011, 2954, 310, 4722, 697, 417, 326, 4460, 285, 891, 651, 1014, 9059, 326, 2067, 6671, 452, 9009, 2074, 5697, 2168, 2299, 352, 556, 417, 1160, 326, 6843, 3340, 253, 11786, 12037, 323, 4499, 422, 4715, 534, 310, 3240, 4623, 323, 1142, 50276, 28269, 275, 1524, 1533, 4893, 4583, 253, 2929, 310, 3240, 973, 3542, 285, 3477, 281, 2096, 285, 956, 347, 753, 1840, 38135, 1537, 320, 247, 2372, 3710, 533, 891, 476, 923, 253, 5649, 273, 27321, 253, 2954, 273, 11419, 285, 4499, 422, 4715, 323, 1142, 10668, 275, 253, 3114, 5661, 1543, 403, 275, 3718, 273, 253, 2746, 2299, 247, 625, 509, 8122, 1783, 24088, 7162, 11508, 651, 452, 644, 14109, 247, 1175, 6010, 273, 253, 1375, 23037, 14387, 285, 27321, 690, 4722, 7688, 5474, 33032, 2520, 2929, 7473, 4219, 247, 4602, 875, 253, 3733, 7259, 273, 1643, 11860, 5148, 613, 920, 285, 4499, 422, 4715, 352, 2722, 326, 5148, 613, 920, 11333, 476, 320, 908, 281, 3215, 1949, 327, 2460, 941, 285, 562, 32231, 2629, 4499, 422, 4715, 327, 15450, 8892, 253, 4477, 2007, 897, 253, 4602, 281, 1287, 941, 42072, 7259, 323, 4499, 422, 4715, 20544, 337, 15965, 7473, 1320, 273, 253, 4602, 875, 4499, 422, 4715, 285, 5148, 613, 920, 374, 747, 941, 42072, 5609, 323, 4499, 422, 4715, 1421, 281, 11701, 327, 2629, 27163, 495, 2929, 310, 2590, 285, 1842, 1767, 410, 324, 577, 2127, 310, 2530, 275, 253, 8499, 50276, 20881, 1255, 265, 337, 253, 2022, 14855, 342, 253, 2929, 310, 253, 3480, 273, 5301, 281, 253, 1781, 2133, 273, 789, 10941, 440, 35421, 4715, 285, 5148, 613, 920, 954, 19836, 288, 3467, 1162, 355, 6247, 2692, 326, 5148, 613, 920, 11333, 476, 320, 908, 323, 440, 35421, 4715, 285, 1142, 273, 253, 9380, 326, 26542, 352, 452, 14859, 27090, 5609, 275, 1097, 10746, 253, 38135, 1060, 3133, 281, 320, 816, 253, 4602, 281, 4499, 422, 4715, 5742, 374, 352, 310, 12744, 604, 667, 273, 253, 3082, 4081, 1060, 651, 9017, 281, 10625, 824, 347, 2505, 534, 310, 417, 6730, 495, 5884, 253, 2929, 4060, 310, 13847, 7738, 326, 359, 476, 3523, 2509, 4499, 422, 4715, 285, 2770, 327, 5148, 613, 920, 1014, 2167, 253, 760, 4602, 310, 275, 253, 941, 5978, 1309, 3733, 417, 275, 253, 7342, 273, 253, 767, 4715, 11951, 304, 983, 50276, 26122, 337, 253, 3733, 6287, 323, 5148, 613, 6118, 5431, 8687, 891, 10491, 247, 3632, 14604, 273, 5971, 285, 21255, 22753, 247, 4735, 4908, 263, 281, 12129, 875, 841, 5971, 436, 8631, 253, 1643, 11860, 4715, 8578, 273, 5148, 613, 920, 5148, 613, 920, 3139, 310, 1199, 16055, 1690, 4893, 281, 22296, 4715, 285, 391, 77, 326, 513, 417, 6388, 10491, 5971, 374, 275, 253, 6703, 6287, 253, 1566, 310, 806, 1442, 292, 37437, 327, 1329, 941, 246, 9245, 50276, 7461, 275, 253, 8346, 6287, 253, 9300, 1566, 310, 908, 281, 3283, 327, 7316, 941, 246, 33980, 50276, 395, 247, 2957, 310, 36625, 342, 1675, 281, 253, 3210, 3602, 1078, 1442, 292, 25004, 11333, 824, 347, 43571, 587, 513, 417, 4858, 4836, 941, 715, 1329, 285, 7316, 941, 495, 841, 3082, 3798, 2430, 5871, 275, 12275, 2317, 534, 310, 43245, 8214, 752, 1057, 436, 1599, 13414, 954, 3676, 37507, 4647, 387, 1878, 581, 4254, 281, 12275, 2317, 50276, 250, 3065, 288, 3467, 1442, 79, 20978, 460, 440, 35421, 4715, 3066, 5148, 613, 920, 17857, 32888, 6247, 253, 2022, 4468, 342, 436, 789, 310, 326, 253, 2022, 12288, 3133, 281, 320, 247, 8489, 625, 2173, 12955, 273, 7313, 1160, 275, 2469, 789, 327, 253, 4602, 875, 4980, 440, 35421, 285, 5148, 
613, 920, 1223, 627, 403, 690, 4722, 5661, 1543, 891, 9644, 1411, 18738, 1677, 253, 3480, 273, 667, 1783, 8664, 752, 253, 38135, 310, 1060, 2429, 281, 1110, 9380, 5474, 339, 431, 248, 2929, 806, 2722, 326, 253, 1655, 4633, 4499, 422, 4715, 323, 1881, 35421, 5304, 6779, 4715, 10764, 690, 14259, 342, 247, 5148, 613, 920, 7792, 323, 1643, 11860, 4715, 436, 6381, 2731, 253, 4477, 281, 12661, 247, 5148, 613, 920, 7792, 323, 1881, 35421, 4715, 275, 1635, 253, 2929, 671, 2722, 326, 5657, 941, 42072, 285, 11786, 12037, 3715, 275, 5148, 613, 920, 476, 1361, 7278, 4499, 422, 40390, 50275, 49363, 1543, 921, 326, 253, 4081, 5148, 613, 920, 7792, 323, 1881, 35421, 4715, 41731, 13015, 948, 498, 83, 247, 1375, 23037, 14387, 4499, 422, 4715, 1332, 327, 2709, 15450, 8892, 275, 247, 49863, 29974, 13337, 4715, 7792, 2167, 352, 17923, 417, 347, 973, 347, 948, 498, 83, 762, 253, 4872, 7103, 7241, 327, 4440, 257, 292, 50276, 977, 1543, 921, 5657, 3715, 275, 5148, 613, 920, 476, 1361, 7278, 4499, 422, 40390, 762, 253, 4872, 7103, 7241, 20544, 337, 4715, 1881, 35421, 5304, 14237, 3066, 5148, 613, 920, 310, 747, 285, 4722, 374, 253, 3559, 5148, 613, 920, 7792, 323, 1881, 35421, 4715, 19132, 689, 253, 948, 498, 83, 275, 15450, 8892, 436, 2722, 253, 2442, 273, 970, 5148, 613, 920, 323, 1881, 35421, 5304, 6779, 4715, 495, 253, 5657, 432, 5148, 613, 920, 403, 8058, 281, 5649, 948, 498, 83, 50276, 20881, 1255, 265, 337, 247, 5161, 1750, 285, 671, 253, 4060, 273, 253, 2929, 310, 4499, 422, 4715, 310, 816, 5148, 613, 920, 4499, 422, 4715, 476, 320, 12814, 347, 247, 2714, 1083, 273, 5148, 613, 920, 342, 247, 2176, 4836, 3268, 2299, 697, 2022, 7125, 403, 417, 21414, 281, 1329, 436, 3908, 275, 1798, 253, 2929, 760, 25339, 2074, 3386, 285, 671, 3910, 6096, 407, 4499, 422, 4715, 285, 5148, 613, 920, 824, 347, 16161, 747, 8892, 327, 783, 16247, 342, 1016, 14604, 285, 4715, 828, 6656, 707, 534, 39970, 281, 4460, 3237, 387, 17032, 841, 6096, 3386, 403, 417, 4451, 390, 5319, 281, 2057, 5148, 613, 920, 390, 4499, 422, 4715, 323, 1650, 3676, 7982, 4715, 3066, 247, 39716, 2957, 671, 35910, 747, 8892, 1027, 44791, 10538, 3870, 909, 800, 6667, 275, 1016, 14604, 285, 13698, 281, 3037, 3386, 39970, 281, 747, 3237, 24088, 747, 9365, 323, 2454, 8981, 281, 1750, 4499, 422, 4715, 310, 816, 5148, 613, 920, 581, 943, 1246, 247, 2087, 5148, 613, 920, 7792, 285, 921, 849, 253, 4499, 422, 4715, 7792, 476, 4944, 275, 326, 7792, 347, 247, 2714, 1083, 671, 581, 943, 1158, 670, 253, 5161, 5319, 273, 841, 767, 31225, 285, 921, 581, 310, 247, 8578, 273, 253, 643, 323, 1650, 891, 858, 417, 923, 2712, 11419, 275, 948, 498, 83, 5010, 760, 4645, 690, 6096, 3386, 352, 310, 2074, 281, 18597, 326, 42322, 403, 816, 3754, 984, 597, 1097, 452, 1740, 9246, 247, 1481, 285, 247, 8105, 50276, 19, 253, 7103, 310, 16706, 275, 1027, 4243, 273, 253, 2929, 2593, 577, 10262, 1543, 432, 1097, 4872, 7241, 285, 15450, 8892, 1223, 2593, 608, 285, 721, 760, 1246, 253, 3438, 690, 4679, 403, 760, 327, 260, 338, 274, 740, 1223, 690, 403, 760, 327, 4440, 257, 292, 352, 310, 12744, 1880, 253, 941, 42072, 285, 11786, 12037, 1361, 5148, 613, 920, 390, 948, 498, 83, 275, 15450, 8892, 50276, 20, 253, 2934, 273, 1781, 9381, 347, 24026, 2957, 9563, 2074, 281, 2720, 789, 327, 49653, 3632, 2460, 39501, 347, 247, 39543, 4836, 323, 1881, 35421, 4715, 752, 310, 253, 2022, 3064, 50276, 21, 253, 4081, 5148, 613, 920, 7792, 323, 1881, 35421, 4715, 4428, 25142, 275, 253, 6703, 6287, 2720, 1881, 35421, 789, 6492, 253, 1180, 273, 3733, 19502, 33032, 3770, 84, 476, 3012, 2818, 253, 3045, 271, 3368, 
310, 3058, 281, 253, 3045, 273, 5148, 613, 920, 285, 948, 498, 83, 762, 1027, 44540, 2562, 569, 352, 310, 671, 4722, 281, 923, 253, 4465, 3733, 673, 2879, 407, 253, 6703, 6287, 50276, 22, 310, 253, 2957, 298, 327, 3239, 608, 247, 4499, 422, 2957, 50276, 28269, 1881, 35421, 5304, 14237, 3066, 5148, 613, 920, 310, 747, 285, 4722, 5661, 1543, 671, 5224, 697, 2442, 12510, 533, 253, 2022, 1750, 4499, 422, 4715, 310, 816, 5148, 613, 920, 310, 417, 973, 4516, 347, 7000, 275, 2022, 2278, 253, 4327, 273, 15302, 285, 8892, 275, 1027, 4679, 310, 47641, 2403, 253, 1543, 1679, 21414, 4679, 327, 253, 3486, 273, 25142, 1561, 253, 6703, 6287, 403, 3058, 281, 921, 625, 801, 554, 394, 5301, 875, 253, 5148, 613, 920, 7792, 285, 948, 498, 83, 5474, 33032, 2520, 2929, 4081, 247, 7792, 281, 19837, 4499, 1644, 813, 35421, 4715, 715, 5148, 613, 920, 6239, 253, 4477, 7568, 326, 4499, 422, 4715, 9241, 9009, 275, 5148, 613, 920, 3082, 824, 347, 391, 19, 69, 19, 476, 5115, 10870, 1543, 327, 2710, 4382, 8113, 8892, 1735, 253, 4477, 4081, 767, 24866, 275, 5148, 613, 920, 6239, 281, 3157, 256, 3433, 3210, 1690, 337, 9381, 10554, 374, 14604, 11786, 12037, 1097, 3082, 7568, 32809, 7756, 689, 253, 2629, 948, 498, 83, 8245, 337, 253, 2934, 273, 24399, 1881, 35421, 4715, 715, 5148, 613, 920, 310, 4460, 253, 12288, 273, 12767, 1027, 42072, 347, 4836, 42072, 310, 4473, 1230, 4722, 1014, 2167, 253, 2929, 1057, 417, 9017, 824, 247, 7792, 281, 253, 1268, 273, 643, 5148, 613, 920, 6239, 326, 15302, 285, 8892, 403, 8127, 1027, 891, 2868, 253, 4473, 4081, 275, 253, 2929, 778, 320, 247, 1175, 17006, 323, 2852, 1881, 35421, 4715, 2561, 50276, 19, 253, 806, 16774, 2934, 4081, 275, 253, 2929, 310, 4555, 816, 4000, 3024, 549, 32693, 11395, 1229, 2357, 1619, 285, 16248, 352, 342, 253, 6036, 24224, 5361, 2746, 310, 671, 417, 4460, 1881, 35421, 6779, 4715, 407, 9381, 4735, 34430, 4906, 30105, 1087, 746, 3103, 352, 556, 3710, 38135, 891, 513, 417, 923, 849, 436, 4758, 310, 1027, 672, 13532, 715, 253, 5148, 613, 920, 7792, 697, 1335, 2444, 347, 271, 24026, 2957, 50276, 20, 253, 1273, 16774, 2934, 3133, 747, 281, 253, 1881, 35421, 4715, 7792, 1014, 2167, 253, 2934, 3139, 3249, 432, 5148, 613, 920, 310, 417, 4460, 891, 1158, 326, 17227, 824, 271, 2934, 476, 789, 275, 247, 1881, 35421, 4715, 7792, 310, 4722, 285, 476, 3164, 5649, 253, 2087, 1881, 35421, 4715, 2561, 3114, 50276, 21, 253, 4477, 2530, 512, 5661, 4278, 323, 39306, 253, 1543, 891, 2096, 326, 697, 417, 1896, 281, 18302, 2629, 948, 498, 83, 1955, 281, 15180, 7563, 533, 352, 3133, 326, 253, 5301, 310, 417, 20297, 891, 513, 417, 923, 667, 4373, 19484, 3186, 323, 253, 8245, 948, 498, 83, 1615, 311, 1566, 841, 4373, 22041, 778, 417, 320, 8654, 323, 1650, 970, 247, 1781, 4715, 2281, 273, 577, 285, 247, 1029, 3276, 273, 16987, 778, 417, 320, 1682, 323, 3733, 1355, 14604, 9552, 824, 347, 17558, 671, 512, 4679, 11701, 403, 32809, 24088, 2829, 721, 818, 854, 760, 14371, 337, 1293, 271, 41389, 4373, 19484, 3186, 436, 337, 50276, 261, 417, 21414, 2217, 50276, 22, 4583, 253, 2929, 310, 973, 3542, 50275, 783, 4473, 273, 16248, 1881, 35421, 4715, 285, 5148, 613, 920, 310, 4722, 285, 778, 452, 247, 8750, 3486, 275, 253, 2852, 253, 806, 10480, 273, 9381, 10554, 556, 3710, 38135, 533, 253, 1273, 10480, 310, 4722, 285, 5183, 281, 320, 4217, 16774, 1543, 403, 512, 32809, 285, 417, 21414, 2217, 2490, 187, 4118, 18435, 27, 2520, 2929, 369, 45210, 1754, 327, 253, 10123, 253, 2929, 2792, 562, 271, 4722, 4602, 8489, 1929, 533, 417, 275, 436, 2173, 2715, 285, 1175, 5661, 1543, 2299, 7418, 30628, 5439, 7350, 
326, 253, 2929, 369, 14999, 247, 5301, 281, 2720, 789, 12873, 440, 35421, 4715, 285, 5148, 613, 920, 954, 19836, 288, 3467, 1162, 355, 6247, 50276, 6438, 4361, 253, 17265, 2715, 273, 253, 2929, 253, 4477, 2953, 436, 2523, 285, 671, 512, 253, 643, 37317, 5701, 275, 5886, 281, 2720, 789, 597, 19148, 326, 597, 2770, 327, 253, 4499, 422, 440, 35421, 1083, 285, 671, 513, 247, 1175, 2628, 275, 22291, 643, 37317, 7350, 4103, 281, 38135, 285, 1543, 50275, 74, 651, 671, 751, 281, 1127, 562, 347, 30628, 671, 858, 326, 253, 2045, 4060, 369, 247, 2372, 13847, 285, 49317, 46107, 253, 4477, 5194, 281, 1818, 352, 281, 247, 625, 8249, 253, 2810, 2954, 875, 4499, 422, 4715, 285, 5148, 613, 920, 50275, 1189, 455, 891, 1158, 253, 4477, 452, 2218, 247, 1175, 3434, 327, 15974, 253, 37317, 7350, 285, 891, 1158, 253, 2929, 651, 320, 4722, 323, 17857, 32888, 10668 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes to increase the latent space dimensionality of images by stacking the latent representation vectors as a tensor. convolutional decoder and encoder networks are then used to map the original data to the latent space and vice versa. the learned latent representations can then be used in a universal framework for multiple tasks such as image inpainting, super-resolution, and colorization. the idea of increasing the dimensionality of the latent space, although not sophisticated, seems to perform very well; indeed, in some of the qualitative experiments the results are surprising. the authors should clarify in more detail how the training procedure is performed: are test images included in the training of the convolutional networks? docsep summary: the paper proposes to embed natural images in a latent convolutional space of high dimensionality to obtain a universal image prior. concretely, each image is embedded as a custom parameter vector of a cnn which turns random noise into the input of a universal generator network to restore the image in pixel space. inference for image restoration is performed by minimizing the energy of a likelihood objective while constraining the latent representation of the restored image to be part of the learned latent space. experiments for inpainting, super-resolution, and colorization are performed to evaluate the proposed method. positive: as mentioned in the paper, i agree that the idea of learning a universal image prior is appealing, since it can be applied to many image restoration tasks without adjustment. i am not very familiar with the related work, but if i understood correctly the paper seems to combine deep latent modeling (glo, bojanowski et al 2018) and deep image priors (ulyanov et al 2018). the experiments show good results which qualitatively appear better than those of related methods; a user study also shows that people mostly prefer the results of the proposed method. did you try other standard restoration tasks such as image denoising or deblurring? if not, do you think they would work equally well? limitations: while i agree that a universal image prior is valuable, the paper should briefly mention what the disadvantages of the proposed approach are. a limitation, at least as presented, is that the corruption process has to be known analytically as a likelihood objective and must be differentiable for gradient-based inference. furthermore, a disadvantage of the universal prior as presented in the paper is that restoring an image requires optimization (e.g. gradient descent); in contrast, corruption-specific neural nets typically just need a forward pass to restore the image and are thus easier and faster to use. restoration inference: how dependent is the restoration result on the initialization, for example when starting gradient descent from the degraded image vs a random image? roughly how many iterations and how much runtime are needed for inference? did you try different optimizers such as lbfgs? docsep summary: this work proposes a new complex latent space described by a convolutional manifold, and this manifold can map the image in a more robust manner when some part of the image is to be restored. pros: the results show that the latent variable mapped to the image represents the image well, and this will be helpful for the image restoration problem. it seems novel to adapt the idea of dip for defining a complex latent space. cons: the main concern is that there is no
guarantee that the defined latent space is continuous. this means it is difficult to judge whether the interpolated point (phi_in, s_in) between two points (phi_1, s_1) and (phi_2, s_2) will be matched to the image distribution. equation 2 in the paper seems to just fit the generator parameters theta to map each phi_i to x_i, i.e. to memorize the mapping between the training images and the given latent convolutional variables. if the proposed algorithm just memorizes the training images and maps them to the given latent convolutional variables, the results cannot justify the claim that the authors propose a new latent space. summary: this work proposes an interesting idea of defining a complex latent space, but there is a suspicion that this work merely memorizes the mapping between the training images and the latent convolutional parameters. i want to see a latent space interpolation test for the proposed latent convolutional space. if the authors provide a convincing explanation of this problem, i would consider changing the rating. see the additional comment for the changed rating. ### Summary:
the reviewers are in general impressed by the results and like the idea, but they also express some uncertainty about how the proposed method is actually set up. the authors have made a good attempt to address the reviewers' concerns.
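the restoration-by-optimization procedure discussed in the second review above (minimizing a likelihood energy over a latent code that is decoded by a generator) can be illustrated with a minimal sketch. this is a hypothetical illustration only, not the authors' implementation: the names `generator`, `degrade_op`, and all hyperparameters are assumptions, and the known corruption operator is assumed to be differentiable, as the reviewer notes.

```python
import torch

def restore(degraded, degrade_op, generator, latent_shape, steps=500, lr=0.01):
    # hypothetical sketch: optimize a latent code so that the decoded image,
    # passed through the known differentiable degradation, matches the observation
    z = torch.randn(latent_shape, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        x_hat = generator(z)                                   # latent -> pixel space
        loss = ((degrade_op(x_hat) - degraded) ** 2).mean()    # likelihood / energy term
        loss.backward()
        opt.step()
    return generator(z).detach()
```

this also makes the reviewer's two caveats concrete: the corruption process must be expressible as a differentiable objective, and restoration costs an optimization loop rather than a single forward pass.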
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 281, 2572, 253, 21624, 2317, 7877, 1319, 50276, 1171, 3888, 407, 37444, 253, 21624, 6779, 11390, 347, 247, 13148, 840, 27311, 267, 29810, 285, 32049, 6928, 403, 908, 281, 3711, 253, 3236, 941, 281, 21624, 2317, 285, 12008, 26620, 253, 6311, 21624, 14237, 476, 840, 320, 908, 275, 247, 10898, 7792, 323, 2709, 8892, 824, 347, 2460, 275, 31406, 1076, 2221, 21061, 285, 3295, 1320, 50276, 783, 2934, 273, 3629, 253, 7877, 1319, 273, 253, 21624, 2317, 3738, 417, 18144, 3133, 281, 320, 9591, 1077, 1175, 6296, 275, 690, 273, 18276, 4679, 253, 1543, 403, 10084, 253, 4477, 943, 19148, 326, 849, 310, 253, 3733, 5199, 2684, 275, 625, 4278, 403, 1071, 3888, 2908, 275, 253, 3733, 253, 27311, 267, 6928, 7152, 33032, 6010, 253, 2929, 29328, 281, 8473, 3626, 3888, 275, 247, 21624, 27311, 267, 2317, 273, 1029, 7877, 1319, 281, 4044, 247, 10898, 2460, 2720, 345, 2414, 600, 1016, 2460, 310, 12691, 347, 247, 2840, 4764, 4972, 273, 247, 260, 9866, 534, 7819, 3632, 6046, 715, 253, 3280, 273, 247, 10898, 14156, 2990, 281, 15042, 253, 2460, 275, 12275, 2317, 17032, 323, 2460, 20384, 310, 2684, 407, 28699, 253, 2341, 273, 247, 12177, 8103, 1223, 1030, 26208, 253, 21624, 6779, 273, 253, 16789, 2460, 281, 320, 629, 273, 253, 6311, 21624, 2317, 4679, 323, 275, 31406, 1076, 2221, 21061, 285, 3295, 1320, 403, 2684, 281, 7472, 253, 4081, 1332, 50275, 10247, 347, 5393, 275, 253, 2929, 891, 5194, 326, 253, 2934, 273, 4715, 247, 10898, 2460, 2720, 310, 23176, 1580, 352, 476, 320, 3732, 281, 1142, 2460, 20384, 8892, 1293, 14000, 891, 717, 417, 1077, 7615, 342, 253, 2905, 789, 533, 604, 891, 7192, 9113, 253, 2929, 3133, 281, 13398, 3676, 21624, 14053, 43976, 1766, 17551, 15767, 1162, 355, 4765, 285, 3676, 2460, 2235, 641, 1484, 314, 46964, 1162, 355, 4765, 253, 4679, 921, 1175, 1543, 534, 36143, 3176, 1805, 685, 1110, 273, 2905, 3082, 247, 2608, 1263, 671, 2722, 326, 952, 6571, 4510, 253, 1543, 273, 253, 4081, 1332, 858, 368, 1611, 643, 2629, 20384, 8892, 824, 347, 2460, 1850, 80, 2182, 390, 372, 49857, 804, 604, 417, 513, 368, 1158, 597, 651, 789, 9696, 973, 50275, 17465, 569, 1223, 891, 5194, 326, 247, 10898, 2460, 2720, 310, 9865, 253, 2929, 943, 13366, 3748, 752, 253, 23797, 273, 253, 4081, 2746, 403, 50276, 66, 12291, 387, 1878, 347, 3559, 310, 326, 253, 16933, 1232, 556, 281, 320, 1929, 41398, 347, 247, 12177, 8103, 285, 1364, 320, 46350, 323, 11786, 3169, 17032, 50276, 44295, 3062, 253, 18928, 273, 253, 10898, 2720, 347, 3559, 275, 253, 2929, 310, 326, 33269, 271, 2460, 4419, 13757, 24088, 11786, 18499, 275, 4499, 17715, 621, 29765, 11454, 37507, 5431, 816, 878, 247, 3579, 1509, 281, 15042, 253, 2460, 285, 403, 3021, 6927, 285, 7938, 281, 897, 50275, 1120, 7843, 17032, 50276, 5430, 7976, 310, 253, 20384, 906, 342, 1675, 281, 253, 31850, 323, 1650, 672, 4983, 11786, 18499, 342, 253, 30853, 2460, 4632, 247, 3632, 2460, 50276, 903, 314, 849, 1142, 25142, 285, 20243, 310, 3058, 323, 17032, 50276, 14958, 368, 1611, 1027, 5556, 14460, 824, 347, 298, 3342, 5943, 7152, 339, 793, 360, 3454, 50276, 2520, 789, 29328, 247, 747, 2570, 21624, 2317, 2529, 407, 27311, 267, 16751, 285, 436, 16751, 476, 3711, 253, 2460, 275, 247, 625, 10237, 5133, 672, 690, 629, 273, 253, 2460, 403, 281, 320, 16789, 50276, 856, 84, 50276, 783, 1543, 921, 326, 253, 21624, 4778, 18301, 281, 253, 2460, 973, 6125, 253, 2460, 285, 352, 588, 320, 9371, 323, 253, 2460, 20384, 1895, 50276, 262, 3133, 
4460, 281, 5223, 253, 2934, 273, 12539, 323, 13947, 2570, 21624, 2317, 50276, 5040, 50276, 783, 2022, 4468, 310, 326, 627, 310, 642, 12215, 326, 253, 2931, 21624, 2317, 310, 5415, 50276, 262, 2097, 326, 352, 310, 2834, 281, 5963, 1880, 253, 20670, 456, 1127, 815, 74, 249, 6868, 875, 767, 2792, 815, 74, 18, 256, 18, 285, 815, 74, 19, 256, 19, 588, 320, 13373, 281, 253, 2460, 3268, 50276, 29813, 374, 275, 253, 2929, 3133, 326, 352, 816, 4944, 253, 14156, 4764, 39116, 281, 3711, 253, 815, 2886, 285, 1269, 74, 285, 16407, 907, 253, 10603, 875, 253, 3733, 3888, 285, 253, 1677, 21624, 27311, 267, 4903, 50276, 338, 253, 4081, 5933, 816, 16407, 4219, 253, 3733, 2460, 285, 3711, 731, 715, 1677, 253, 21624, 27311, 253, 906, 2550, 15249, 253, 10419, 326, 253, 2488, 29328, 247, 747, 21624, 2317, 50276, 8774, 50276, 2520, 789, 29328, 271, 4722, 2934, 273, 13947, 2570, 21624, 2317, 533, 352, 310, 38342, 326, 436, 789, 816, 16407, 1025, 253, 10603, 875, 253, 3733, 3888, 285, 253, 21624, 27311, 267, 3602, 50276, 74, 971, 281, 923, 253, 21624, 2317, 30370, 1071, 323, 253, 4081, 21624, 27311, 267, 2317, 604, 253, 2488, 3400, 247, 15585, 8813, 273, 253, 1895, 891, 651, 1908, 6890, 253, 13716, 50275, 2887, 253, 3081, 4385, 323, 253, 4391, 13716, 2490, 187, 4118, 18435, 27, 783, 30628, 403, 275, 2087, 17847, 407, 253, 1543, 285, 751, 253, 2934, 533, 597, 671, 3890, 690, 11649, 670, 849, 253, 4081, 2686, 310, 873, 598, 253, 4477, 452, 1160, 247, 1175, 3177, 281, 2953, 253, 30628, 7350, 209 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 281, 2572, 253, 21624, 2317, 7877, 1319, 50276, 1171, 3888, 407, 37444, 253, 21624, 6779, 11390, 347, 247, 13148, 840, 27311, 267, 29810, 285, 32049, 6928, 403, 908, 281, 3711, 253, 3236, 941, 281, 21624, 2317, 285, 12008, 26620, 253, 6311, 21624, 14237, 476, 840, 320, 908, 275, 247, 10898, 7792, 323, 2709, 8892, 824, 347, 2460, 275, 31406, 1076, 2221, 21061, 285, 3295, 1320, 50276, 783, 2934, 273, 3629, 253, 7877, 1319, 273, 253, 21624, 2317, 3738, 417, 18144, 3133, 281, 320, 9591, 1077, 1175, 6296, 275, 690, 273, 18276, 4679, 253, 1543, 403, 10084, 253, 4477, 943, 19148, 326, 849, 310, 253, 3733, 5199, 2684, 275, 625, 4278, 403, 1071, 3888, 2908, 275, 253, 3733, 253, 27311, 267, 6928, 7152, 33032, 6010, 253, 2929, 29328, 281, 8473, 3626, 3888, 275, 247, 21624, 27311, 267, 2317, 273, 1029, 7877, 1319, 281, 4044, 247, 10898, 2460, 2720, 345, 2414, 600, 1016, 2460, 310, 12691, 347, 247, 2840, 4764, 4972, 273, 247, 260, 9866, 534, 7819, 3632, 6046, 715, 253, 3280, 273, 247, 10898, 14156, 2990, 281, 15042, 253, 2460, 275, 12275, 2317, 17032, 323, 2460, 20384, 310, 2684, 407, 28699, 253, 2341, 273, 247, 12177, 8103, 1223, 1030, 26208, 253, 21624, 6779, 273, 253, 16789, 2460, 281, 320, 629, 273, 253, 6311, 21624, 2317, 4679, 323, 275, 31406, 1076, 2221, 21061, 285, 3295, 1320, 403, 2684, 281, 7472, 253, 4081, 1332, 50275, 10247, 347, 5393, 275, 253, 2929, 891, 5194, 326, 253, 2934, 273, 4715, 247, 10898, 2460, 2720, 310, 23176, 1580, 352, 476, 320, 3732, 281, 1142, 2460, 20384, 8892, 1293, 14000, 891, 717, 417, 1077, 7615, 342, 253, 2905, 789, 533, 604, 891, 7192, 9113, 253, 2929, 3133, 281, 13398, 3676, 21624, 14053, 43976, 1766, 17551, 15767, 1162, 355, 4765, 285, 3676, 2460, 2235, 641, 1484, 314, 46964, 1162, 355, 4765, 253, 4679, 921, 1175, 1543, 534, 36143, 3176, 1805, 685, 1110, 273, 2905, 3082, 247, 2608, 1263, 671, 2722, 326, 952, 6571, 4510, 253, 1543, 273, 253, 4081, 1332, 858, 368, 1611, 643, 2629, 20384, 8892, 824, 347, 2460, 1850, 80, 2182, 390, 372, 49857, 804, 604, 417, 513, 368, 1158, 597, 651, 789, 9696, 973, 50275, 17465, 569, 1223, 891, 5194, 326, 247, 10898, 2460, 2720, 310, 9865, 253, 2929, 943, 13366, 3748, 752, 253, 23797, 273, 253, 4081, 2746, 403, 50276, 66, 12291, 387, 1878, 347, 3559, 310, 326, 253, 16933, 1232, 556, 281, 320, 1929, 41398, 347, 247, 12177, 8103, 285, 1364, 320, 46350, 323, 11786, 3169, 17032, 50276, 44295, 3062, 253, 18928, 273, 253, 10898, 2720, 347, 3559, 275, 253, 2929, 310, 326, 33269, 271, 2460, 4419, 13757, 24088, 11786, 18499, 275, 4499, 17715, 621, 29765, 11454, 37507, 5431, 816, 878, 247, 3579, 1509, 281, 15042, 253, 2460, 285, 403, 3021, 6927, 285, 7938, 281, 897, 50275, 1120, 7843, 17032, 50276, 5430, 7976, 310, 253, 20384, 906, 342, 1675, 281, 253, 31850, 323, 1650, 672, 4983, 11786, 18499, 342, 253, 30853, 2460, 4632, 247, 3632, 2460, 50276, 903, 314, 849, 1142, 25142, 285, 20243, 310, 3058, 323, 17032, 50276, 14958, 368, 1611, 1027, 5556, 14460, 824, 347, 298, 3342, 5943, 7152, 339, 793, 360, 3454, 50276, 2520, 789, 29328, 247, 747, 2570, 21624, 2317, 2529, 407, 27311, 267, 16751, 285, 436, 16751, 476, 3711, 253, 2460, 275, 247, 625, 10237, 5133, 672, 690, 629, 273, 253, 2460, 403, 281, 320, 16789, 50276, 856, 84, 50276, 783, 1543, 921, 326, 253, 21624, 4778, 18301, 281, 253, 2460, 973, 6125, 253, 2460, 285, 352, 588, 320, 9371, 323, 253, 2460, 20384, 1895, 50276, 262, 3133, 
4460, 281, 5223, 253, 2934, 273, 12539, 323, 13947, 2570, 21624, 2317, 50276, 5040, 50276, 783, 2022, 4468, 310, 326, 627, 310, 642, 12215, 326, 253, 2931, 21624, 2317, 310, 5415, 50276, 262, 2097, 326, 352, 310, 2834, 281, 5963, 1880, 253, 20670, 456, 1127, 815, 74, 249, 6868, 875, 767, 2792, 815, 74, 18, 256, 18, 285, 815, 74, 19, 256, 19, 588, 320, 13373, 281, 253, 2460, 3268, 50276, 29813, 374, 275, 253, 2929, 3133, 326, 352, 816, 4944, 253, 14156, 4764, 39116, 281, 3711, 253, 815, 2886, 285, 1269, 74, 285, 16407, 907, 253, 10603, 875, 253, 3733, 3888, 285, 253, 1677, 21624, 27311, 267, 4903, 50276, 338, 253, 4081, 5933, 816, 16407, 4219, 253, 3733, 2460, 285, 3711, 731, 715, 1677, 253, 21624, 27311, 253, 906, 2550, 15249, 253, 10419, 326, 253, 2488, 29328, 247, 747, 21624, 2317, 50276, 8774, 50276, 2520, 789, 29328, 271, 4722, 2934, 273, 13947, 2570, 21624, 2317, 533, 352, 310, 38342, 326, 436, 789, 816, 16407, 1025, 253, 10603, 875, 253, 3733, 3888, 285, 253, 21624, 27311, 267, 3602, 50276, 74, 971, 281, 923, 253, 21624, 2317, 30370, 1071, 323, 253, 4081, 21624, 27311, 267, 2317, 604, 253, 2488, 3400, 247, 15585, 8813, 273, 253, 1895, 891, 651, 1908, 6890, 253, 13716, 50275, 2887, 253, 3081, 4385, 323, 253, 4391, 13716, 2490, 187, 4118, 18435, 27, 783, 30628, 403, 275, 2087, 17847, 407, 253, 1543, 285, 751, 253, 2934, 533, 597, 671, 3890, 690, 11649, 670, 849, 253, 4081, 2686, 310, 873, 598, 253, 4477, 452, 1160, 247, 1175, 3177, 281, 2953, 253, 30628, 7350, 209 ]
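the latent space interpolation test requested by the third reviewer above can likewise be sketched. this is a hypothetical check, not an experiment reported in the paper; the `generator` and the latent codes `phi_1` and `phi_2` are assumed to come from the trained model.

```python
import torch

def interpolation_test(generator, phi_1, phi_2, num_steps=8):
    # hypothetical continuity check: decode convex combinations of two learned
    # latent codes and inspect whether the outputs remain image-like
    frames = []
    for k in range(num_steps + 1):
        alpha = k / num_steps
        phi = (1 - alpha) * phi_1 + alpha * phi_2
        with torch.no_grad():
            frames.append(generator(phi))
    return frames
```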
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes a new building block for gnns, called gnn. this building block trades off depth for width and involves multiple parallel regular gnn processing units. using the gnn architecture, the authors establish bounds on the required network depth and total parameters for several combinatorial problems over graphs. pros: a parallel processing architecture, as opposed to sequential processing, is also likely to be computationally more efficient in large-memory setups. i haven't carefully checked all the proofs, but the key theorems seem relevant and interesting. the paper is clearly written for most parts, and the key intuitions behind the theoretical results are explained in sufficient depth. cons: the parallel gnn-mp and mlp modules within a gnn block all share parameters, and hence the only source of difference in the computed function is due to differences in the initialization of node features. how is that difference executed in practice, both in the problems considered in this work and more generally for any graph problem? from the description in the experiments section, it seems that the networks are implemented using the same set of features for the nodes during initialization (e.g. node degrees, booleans for source and/or destination nodes). the paper lacks evidence on whether this model outperforms gnns on real-world graph datasets. all the problems considered in this work are combinatorial optimization problems on erdos-renyi graphs, for which non-learning-based solutions also exist, while many of the theoretical results are also custom to these problems; one would hope that the architecture is also evaluated on standard tasks (e.g. classification) for real-world benchmark graph datasets. the central premise of the paper seems to be that the gnn models require less depth and fewer total parameters compared to standard gnn architectures; this is not clear from the experiments. a more descriptive result would be to assess the performance of these models as a function of depth and total parameters. docsep the main question this paper tackles is: can one develop sample-efficient architectures for graph problems while retaining the simplicity of the message passing framework? while combining message passing with gnns has been shown to have positive empirical results, we do not know of a general neural architecture that is versatile enough to provably encode the solution space of a variety of graph problems such as shortest paths and minimum spanning trees. this paper introduces a theoretically principled architecture, gcn, which attempts to make gcns more efficient by using ideas from the subfields of sketching, approximations, and parallel computing. i vote for accepting the paper due to its novelty and the pros listed below. pros: an interesting paper with a novel contribution combining ideas from parallel computing and sketching approximations; a major algorithmic contribution of interest to practitioners; theoretical contributions in the form of several theorems in the paper. cons: no code provided with the submission. docsep the authors propose a variation of gnn, which they name gnn, which runs several gnn modules in parallel with weight sharing. the authors show that it works well in practice on several synthetic datasets and show many theoretical results. i have some issues with the theoretical results and would raise the score if addressed. unless i am mistaken, the gnn architecture requires different feature inputs for all copies of the gnn; this
should be added into the parameter counting that is done: when the parameters are the same across layers they are counted as one. while this is true if you enforce weight sharing, in general we train without this constraint; furthermore, this isn't done for gnns, which leads to an unfair comparison. the method should be the same, preferably without discounting weight sharing. maybe i am missing something basic, but the node affinity definition doesn't make any sense to me: a_p(u,v) should be just p times the indicator of (u,v) in e, so it shouldn't be more informative than just the basic adjacency matrix. it doesn't seem like there is any theoretical gain when comparing theorems 6 and 8, considering that in theorem 6 alpha = 3/2 and the number of parameters is o(b^2 l) (also, gnns don't have to have a fixed representation size b_l in each layer, where b_l = omega(n)), while in theorem 8 we get a number of parameters of o(n^2 (1/c) log n), which for c = 2, where they agree, doesn't seem to give a better bound. you do not show how to implement bellman-ford with a gnn, which is missing from the proof. minor remarks: in app a, the first equation should be min, not max; in alg 1 it should be x_i = min(x_i, a_i + 1), not x_i = min(x_i, a_i); also, the switching between x and v is confusing. docsep overview: the paper suggests a modification to graph neural networks which is claimed to overcome gnn expressiveness issues recently shown by loukas (iclr 2020). comments: i had multiple difficulties in following the content of the paper; some are detailed next. regarding "unfortunately gnns require large depth as proved in theorem 6 above": the lower bound in theorem 6 is for approximation alpha = 1.5, but here you discuss approximation o(log n); doesn't this void the lower bound? why cannot usual gnns produce an o(log n) approximation, and specifically bourgain's embedding? regarding "while there do exist semidefinite programming based algorithms (linial et al 1995) for computing the embeddings required for bourgain's theorem, they are not suitable for implementation via efficient neural architectures; instead we adapt the sketch based approximate shortest path algorithms of das sarma et al 2010": in fact, the algorithm you implement is the one from bourgain and linial et al, not das sarma et al. all those works use roughly the same sketching algorithm, which measures the distance of each point to randomly chosen clusters; however, the distance estimation procedure you implement, specifically d_g(s,t) = max_i |v_si - v_ti|, is bourgain's; this is just an ell-infinity embedding. das sarma et al's estimation procedure is different and, building on the well-known work of thorup-zwick, relies on computing the common nearest neighbors of the given pair (s,t) in the random clusters. note that this bears on the correctness of the proof of theorem 8; i think the statement still holds due to matousek's analysis of bourgain's embedding, but not for the reason you cite. also note that in theorem 8, as well as all aforementioned results, c needs to be an integer; in particular, it is known that any approximation less than 3 is impossible with less than omega(n^2) parameters. for min-cut, you write that traditional gnns require omega(sqrt(n)) depth/rounds, citing loukas 2020, but doesn't that lower bound entail both the depth and the width, d sqrt(w) = omega(sqrt(n))? i am unable to follow the proof of theorem 9; could you please explain the correctness of your construction? on this note, the sentence "karger stein 1996 implies that with probability at least 1/n^2 there exists a prefix l' of l such that ..." seems like an unfortunate inaccuracy: the prefix exists deterministically, and their guarantee is that the iterative random contraction algorithm finds it
with probability at least 1/n^2. conclusion: i am currently unable to recommend accepting this paper, due to what seems like multiple inaccuracies, misinterpretations of prior work, unclear statements, and possibly technical correctness issues. i will await clarifications from the authors on the points detailed above. post-discussion update: after discussion with the authors i have calibrated my score upward to 4, since the authors seemed willing to engage in discussion and correct/improve the paper, which i appreciate, but i still recommend not accepting the paper. the authors generally acknowledged, though have not yet fixed, the issue of wrong attribution of the apsp algorithm. this isn't just a matter of citing b instead of a: the paper still contains a lengthy discussion of why a is not suitable, so that instead they must resort to b, even though in reality they just use a and b remains unused. this is glaring since these papers are famous classics widely taught in graduate courses; their content is well known, and it is puzzling how a diametrically incorrect representation of them made its way into the paper. the reason i dwell on this is that it signifies a larger issue with the paper: the original version was peppered with formal statements which were at best inaccurate, and even though the authors fixed, or said they would fix, the ones i pointed out, i remain unable to trust the overall technical soundness of the paper. the review time frame doesn't allow a reviewer to carefully verify every statement, nor would i want to; there must be some commitment of due diligence on the part of the authors that, up to a small inevitable fraction of inaccuracies, the formal content is rigorously correct. i'm afraid the current version of the paper is quite off this mark. putting formal soundness aside, my present understanding of the idea of the paper is the following: the authors observe that many basic computations on graphs can be parallelized into a few computations of small width and depth. usual gnns can implicitly implement this if their width is large enough, but this poses a computational burden, and there are obvious advantages to explicitly building this parallelism into the architecture. this seems like a sensible and potentially empirically useful observation, but the experimental section still seems too thin to make the case properly. that said, perhaps i have not fully understood the paper, since its frequent inaccuracies and fuzzy statements made it a bit hard for me to follow. in conclusion, i think the paper should undergo a substantial revision: 1) clean up the theory part and ensure its formal soundness; 2) crystallize the point of the paper; in particular, rather than just presenting gnn, i hope a revised version would include a more thorough comparison with usual gnns, not just dismiss them with some citations of prior works which allegedly prove limitations (this leaves doubts about the exact model and assumptions, particularly since, as discussed above, the prior work is not always cited accurately); 3) possibly expand the experimental section. ### Summary:
the paper presents a new gnn architecture and provides interesting theoretical observations about the architecture. the paper is quite promising and has several interesting insights; however, most of the reviewers believe that the paper is not ready for publication and can be significantly improved by (a) more formal and precise statements, (b) clarifying the key points of the paper, and (c) more thorough experimental validation of the framework on real-world datasets.
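the sketch-based distance estimate that the review above attributes to bourgain (distances to random seed sets, compared coordinate-wise with an ell-infinity norm) can be illustrated with a short sketch. this is a hypothetical illustration of the classical construction, not code from the paper under review; the function names, the use of bfs for unweighted graphs, and the number of repetitions per scale are all assumptions.

```python
import random
from collections import deque

def bfs_dists(adj, sources):
    # distance from every node to the closest node in `sources` (unweighted graph)
    dist = {u: float("inf") for u in adj}
    q = deque()
    for s in sources:
        dist[s] = 0
        q.append(s)
    while q:
        u = q.popleft()
        for w in adj[u]:
            if dist[w] == float("inf"):
                dist[w] = dist[u] + 1
                q.append(w)
    return dist

def bourgain_style_sketch(adj, reps=3, seed=0):
    # hypothetical sketch: one coordinate per (scale, repetition) pair, equal to the
    # distance from the node to a random seed set whose size grows geometrically
    rng = random.Random(seed)
    nodes = list(adj)
    n = len(nodes)
    coords = {u: [] for u in nodes}
    for i in range(max(1, n.bit_length())):
        for _ in range(reps):
            seeds = rng.sample(nodes, min(n, 2 ** i))
            d = bfs_dists(adj, seeds)
            for u in nodes:
                coords[u].append(d[u])
    return coords

def estimate_distance(coords, s, t):
    # ell-infinity estimate d_G(s, t) ~ max_i |v_{s,i} - v_{t,i}|; each coordinate
    # difference is a lower bound on the true distance by the triangle inequality
    return max(abs(a - b) for a, b in zip(coords[s], coords[t]))
```

the estimate never exceeds the true distance because distance-to-a-set is 1-lipschitz, which is the property the reviewer contrasts with das sarma et al's common-seed-based estimator.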
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 247, 747, 3652, 2972, 323, 18976, 2224, 1925, 305, 9866, 436, 3652, 2972, 28587, 273, 6864, 323, 4871, 285, 8687, 2709, 7529, 3963, 305, 9866, 5162, 5085, 970, 253, 305, 9866, 10336, 253, 4477, 5100, 14493, 323, 253, 2424, 2990, 6864, 285, 2264, 3602, 323, 2067, 38183, 3237, 689, 14580, 50276, 856, 84, 50276, 66, 7529, 5162, 10336, 347, 10066, 281, 22453, 5162, 310, 671, 2779, 281, 320, 43245, 625, 5919, 275, 1781, 3541, 873, 8777, 50275, 74, 419, 2254, 9257, 10141, 512, 253, 27947, 533, 253, 2234, 39383, 1646, 4623, 285, 4722, 50275, 783, 2929, 310, 4518, 3542, 323, 954, 4243, 285, 253, 2234, 16875, 4431, 3212, 253, 10527, 1543, 403, 5544, 275, 4209, 6864, 50275, 5040, 50276, 783, 7529, 305, 9866, 2503, 285, 13361, 81, 11911, 1561, 247, 305, 9866, 2972, 512, 3894, 3602, 285, 7613, 253, 760, 2603, 273, 3064, 275, 253, 10302, 1159, 310, 1955, 281, 3910, 275, 31850, 273, 4666, 3386, 849, 310, 326, 3064, 11407, 275, 3946, 1097, 275, 253, 3237, 2783, 275, 436, 789, 285, 625, 3839, 323, 667, 4216, 1895, 432, 253, 5740, 275, 253, 4679, 2593, 352, 3133, 326, 253, 6928, 403, 9009, 970, 1072, 873, 273, 3386, 323, 253, 7632, 1309, 31850, 24088, 4666, 7759, 1766, 1306, 507, 323, 2603, 285, 263, 12095, 7632, 50275, 783, 2929, 19756, 1941, 604, 436, 1566, 41731, 13015, 18976, 2224, 327, 1524, 10186, 4216, 15302, 512, 253, 3237, 2783, 275, 436, 789, 403, 38183, 13757, 3237, 327, 2827, 38755, 445, 28212, 14580, 323, 534, 1327, 28269, 1754, 5482, 671, 2226, 1223, 1142, 273, 253, 10527, 1543, 403, 671, 2840, 281, 841, 3237, 581, 651, 3524, 326, 253, 10336, 310, 671, 6760, 327, 2629, 8892, 24088, 9162, 323, 1524, 10186, 22791, 4216, 15302, 50275, 783, 4275, 26536, 273, 253, 2929, 3133, 281, 320, 326, 253, 305, 9866, 3210, 2430, 1679, 6864, 285, 2264, 3602, 2429, 281, 2629, 305, 9866, 35615, 436, 310, 417, 2590, 432, 253, 4679, 247, 625, 27389, 906, 651, 320, 281, 2939, 253, 3045, 273, 841, 3210, 347, 247, 1159, 273, 6864, 285, 2264, 3602, 50276, 7152, 339, 431, 248, 2022, 1953, 436, 2929, 39223, 310, 476, 581, 1287, 3410, 5919, 35615, 323, 4216, 3237, 1223, 26179, 253, 17647, 273, 253, 3935, 8136, 7792, 50276, 6050, 16248, 3935, 8136, 342, 18976, 2224, 556, 2011, 281, 452, 2762, 16774, 1543, 50276, 664, 513, 417, 871, 273, 247, 2087, 11454, 10336, 326, 310, 50276, 735, 12596, 2217, 281, 872, 1598, 22573, 253, 2900, 2317, 273, 247, 5235, 273, 4216, 3237, 824, 347, 30505, 11865, 285, 5927, 28369, 7139, 50275, 2520, 2929, 23970, 247, 28055, 3505, 74, 6216, 10336, 50276, 72, 14340, 50276, 4609, 310, 9437, 281, 1056, 305, 68, 2224, 625, 5919, 407, 970, 5697, 432, 253, 749, 15069, 273, 30547, 7695, 34754, 285, 7529, 12672, 50273, 74, 6273, 323, 18738, 253, 2929, 1955, 281, 697, 38135, 285, 253, 5847, 7117, 2708, 50272, 856, 84, 50275, 266, 4722, 2929, 342, 4460, 7680, 16248, 5697, 432, 7529, 12672, 285, 30547, 7695, 34754, 50275, 24330, 5933, 280, 7680, 273, 1600, 281, 24432, 50275, 783, 33977, 9021, 275, 253, 830, 273, 2067, 39383, 275, 253, 2929, 50275, 5040, 50274, 2369, 2127, 2530, 342, 253, 19529, 7152, 339, 431, 248, 4477, 12661, 247, 7629, 273, 305, 9866, 534, 597, 1416, 305, 9866, 534, 6613, 2067, 305, 9866, 11911, 275, 7529, 342, 2801, 9628, 253, 4477, 921, 326, 352, 2987, 973, 275, 3946, 327, 2067, 13506, 15302, 285, 921, 1142, 10527, 1543, 50276, 74, 452, 690, 2523, 342, 253, 10527, 1543, 285, 651, 7164, 253, 4868, 604, 9713, 50275, 28558, 
[Tokenized columns for the preceding example omitted: the input_ids and labels fields are long lists of integer token IDs encoding the same review and summary text, and the attention_mask field is a list of 1s of matching length.]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

The contribution made in the paper appears to be very specialised; as I am not very familiar with this topic, it is difficult for me to judge the merit of the contributions. The main weakness of this work is the lack of clarity: most readers might not be familiar with MEVs, and the paper is fairly poor at explaining this concept and the paper's contributions to a wider audience. The paper is also written in a very condensed way; I encourage the authors to provide more intuitive motivation, e.g. for the introduced network architecture. To improve my understanding of the topic I inspected the code: it is very difficult to figure out what it is doing and where the parts corresponding to the main contributions are implemented. I think code clarity is especially important for topics which are not widely known, as readers can learn about the method by understanding its application to data. The code should have its environment and clearly defined commands that can be executed to run the experiments; I also recommend that the authors use arguments to scripts, as opposed to asking to uncomment the appropriate experiment at the bottom of the script.

docsep

The paper is very well written. It makes a comprehensive study of the question, including theoretical, algorithmic and experimental aspects, and the exhaustive documentation in the appendix is impressive. The contribution is highly technical and most likely only easily accessible to experts in extreme value distributions; some aspects are difficult to follow for the unfamiliar reader:
1. In practice, how does one handle/choose the normalizing sequences a_k and b_k described in the background?
2. What is the benefit of multiple-layer models if one layer is enough for universal approximation? How should the number of layers be chosen?
3. I do not see a baseline comparison. Is there nothing that makes sense? I would expect that at least the low-dimensional scalar case offers some comparison options.

docsep

The paper is well written and the logic is easy to follow. There is a theory for the model, and the authors also run experiments to justify the effectiveness of their model, which seems sound to me. I cannot find significant issues in this paper. One thing I am not very sure about: in Theorem 2, why does converging to a zero-mean Gaussian process imply uniform convergence? Doesn't the variance matter here? In Algorithm 1, is using Adam to update the parameters necessary? How about SGD? What is the optimization cost of the algorithm? It seems like it takes longer to train the proposed network, and there is also some difficulty in tuning the hyperparameters, as mentioned by the authors.

docsep

The motivation for this work is important: combining data-driven approaches from ML (using NNs) with EVT is clearly inherently challenging, as the extreme values are rare. While this work is not aligned with my area of expertise, the paper appears well written, with sufficient background to get an understanding of its significance. Code and extensive supplementary material are provided, which will help with reproducibility. I believe that the proposed DMNN architecture will be interesting to people in the community. The process of hyperparameter optimisation is not described, and it is mentioned that the performance could be improved if the hyperparameters were tuned better; are the baselines also in the same position that they could be improved with hyperparameter tuning? I realise some are parameter-free, so maybe this is not an issue. It seems more effort was placed in showcasing DMNNs compared to the generative approach; personally, it feels that the generative model approach is less developed, at least judging from the empirical results. Would the paper have been better if it had just focussed on the DMNNs and left the generative approach to the appendix? The text in the figures of the main paper is too small to read. For someone with less experience in this area, Figures 2 and 5 are difficult to interpret: what is the behaviour we are supposed to be looking for in these figures that makes DMNN superior (e.g. convexity etc.)? It would help the reader to spell it out a bit more. What is a block in Algorithm 1?

### Summary:
Meta review: the paper proposes networks for high-dimensional extreme value distributions. The reviewers were generally happy with the paper and the author response. Please account for the reviewer comments in your final revisions.
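As a purely illustrative aside on reviewer question 1 above (this is standard background extreme value theory, not material from the reviewed paper): the normalizing sequences (a_k and b_k in the review's notation) are the constants in the Fisher–Tippett–Gnedenko limit for the sample maximum,

\[
\Pr\!\left(\frac{M_n - b_n}{a_n} \le x\right) \;\longrightarrow\; G(x) \quad \text{as } n \to \infty, \qquad M_n = \max(X_1, \dots, X_n),
\]

where G is a generalized extreme value (GEV) distribution. For instance, for i.i.d. standard exponential variables one may take a_n = 1 and b_n = \log n, which gives the Gumbel limit G(x) = \exp(-e^{-x}).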
[Tokenized columns for the example above omitted: the input_ids and labels fields repeat the same review and summary text as long lists of integer token IDs, and the attention_mask field is a list of 1s of matching length.]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

Summary: the authors aim at solving an interesting problem for Granger causality, i.e. for different objects/environments/specific applications the causal graphs can be different, but the causal relationships (functional relationships/dynamics) have the same form; how, then, to infer the causal structures for the different cases while utilizing the shared dynamics? They propose a probabilistic implementation, modeling the graph structure with an encoder and the shared dynamics with a decoder. The experiments are based on simple physical synthetic data and investigate the influence of unknown confounding; in particular, the authors propose a solution for the case with unobserved time-series data.

Pros:
(a) An interesting view of the problem. There are works in causal discovery focusing on the setting where the causal graph is shared across different environments while the causal relationships take different forms; it is also important to consider the scenario where the causal graph is not shared. Although the ideal case would be that both the causal graphs and the causal relationships are not fully shared, the work indeed provides great value for the further development of causal discovery.
(b) A well-written paper. The paper is well written, which makes the idea and method easy to understand, especially by first introducing the model and high-level idea and then the probabilistic implementation.
(c) Investigation of the presence of unknown confounding. The work also provides an extension for the case where there are unobserved time series; the solution is straightforward, makes sense, and has reasonable results.

Concerns/questions:
1. In the preliminaries, it took me a while to understand what a "sample" means. I would suggest elaborating the concept of a sample with examples or further explanation; different samples seem to correspond to different applications, which can have different causal graphs.
2. Eqn. 8 doesn't seem to be consistent with Eqns. 13-14, because the decoder seems to output the next time value x_{t+1} based on the current value x_t, but Eqn. 8 only conditions on z without indicating the actual process.
3. Eqn. 13 hard-codes the modeling of x_{j,t}; this means it must be the case that x_{j,t} is a direct cause of x_{j,t+1}. However, that may not always hold: what if x_{j,t} is not a direct cause?
4. One minor concern: I think it would be interesting to see how the model performs on data generated with nonlinear autoregressive models, which are used more often in causal discovery, instead of physical synthetic data.
5. It remains unclear to me how flexible the model is with its decoder (Eqns. 12-14), or how rich the functional class is. It seems to model the function of a vector input under an assumption that can be simplified as f(x_1, ..., x_i, ..., x_n) = f_v(\sum_{i=1}^{n} f_e(x_i)).

docsep

Quality and clarity: there seem to be many important points that are not explained in this paper. Although it says "we learn to infer causal relations across samples with different underlying causal graphs but shared dynamics" on pages 1 and 3, it also says edges are invariant to time on page 4. I was confused when I read this paper because I first thought that the underlying causal mechanisms change over time; I still don't understand what each graph corresponds to. Section 3 lacks the details of the method to identify latent confounders, so the method is not likely reproducible. I don't understand why the authors' proposed method is robust even if there are hidden confounders; the details of the method and a theoretical explanation for it are needed. I wonder whether the previous methods compared in this paper generate causal graphs with cycles; if not, the authors need to explain in detail how they compared their method with the previous methods.

Originality and significance: there is no novelty in the problem set addressed in this paper; it proposes a new method for an existing problem. However, it lacks the details of the method and the theoretical explanation to support the effectiveness of the method. I wonder why the authors propose to learn a causal graph in the form of aggregated time-series causal graphs (summary graphs); the graphs inferred by the method proposed by Hyvärinen et al. (2010) contain more detailed information about causal relationships. In my opinion, the causal graphs proposed in this paper are inferior to the graphs generated by the methods proposed in existing research in terms of the detail of the information, and the authors need to clarify this point. The authors refer to Peters et al. (2017), but I could not find any description of summary graphs in that book.

docsep

Originality: the paper provides something new, namely it faces the problem, probably not yet well addressed in the existing literature, that causal discovery methods may infer a different causal graph for each sample when the data are time series. The nature of this problem should have been better justified from a statistical/theoretical point of view; for example, is this due to non-ergodicity or non-stationarity of the underlying stochastic process? The idea of amortization across different causal graphs through the encoding and decoding functions is very original, at least it is new to me. Related works are correctly discussed.

Significance: the paper surely addresses an important problem, namely causal heterogeneity across samples, which is relevant to the CLeaR community. The proposed method significantly advances the state of the art, although the relevance of this method depends very much on the assumption of shared dynamics; it is difficult to assess whether this assumption holds in empirical applications, and the authors present applications to simulated data only. The method proposed in the paper, if better motivated and justified, is likely to have some impact outside the CLeaR community, e.g. in neuroscience, finance and perhaps economics. The paper hinges on the Granger causality framework; therefore it may be more interesting for fields of study with high-frequency data.

Technical quality: the proposed approach is technically sound, but I expected more details about the encoder and decoder functions. Minor point: in Equation 2, the time index of the noise variable should probably be t-1 instead of t.

Clarity: the paper is clearly written and well organised. The motivation of the paper could be improved: it should be clarified under which conditions we get different graphs from different samples, and which settings support the shared-dynamics assumption.

Overall assessment: a nice contribution in terms of a new method/framework, but the paper needs to be better motivated; it should be better clarified why it is interesting, or even sound, to consolidate causal discovery output over other samples.

### Summary:
This paper tackles an interesting but specialized problem in causal discovery for multiple (or multivariate) time series. The exact nature of the problem setup is not at all clear from the paper unless it is read very carefully, and it understandably confused at least one reviewer, so if this is accepted the final version should have a significantly revised abstract and introduction.

The setting, as I understand it, is as follows. We have multiple data sets, each of which consists of N univariate time series, all of length T, assumed to be regularly and simultaneously observed. Each data set comes from an independent system; these systems, or data sets, are what the manuscript confusingly calls "samples". Not only are they statistically independent, but the graph of effective connectivity among the N univariate processes differs from one system to another. However, there is a common functional form, shared across all the systems, for how the future of each of the N nodes is generated from its own past and the past of its neighbors; this differs from system to system only up to a finite-dimensional set of parameters, which represents graph structure as well as, e.g., connection strength. Thus, for instance, we might be looking at neuronal firing rates and saying that x_{i,t} = \mathrm{logit}^{-1}\left(\alpha + \sum_{j \neq i} w_{ji}\, x_{j,t-1}\right), where the assumption of a common functional form shows up in the additivity and the inverse-logit parts. (This isn't a great model of neuronal response, but you get the idea.) As the authors say in their replies to referees, and as the manuscript hints, an obvious application of this would be multi-electrode-array neuronal recordings, where each data set, from a different animal, would record different neurons with a different graph, but one might hope for a common neuronal response mechanism across experimental subjects. I can't come up with a second convincing application, especially not with fixed N across samples, which seems important to the encoding/decoding step.

The innovation in this paper is to separate learning the common functional form of the vector autoregression from learning the graph, thereby allowing for pooling of information about the shared part of the model across data sets. The reports agree that this is cleverly done, though it is not entirely clear what the limits on the expressive power of this method are, nor under what conditions it will converge on either the correct graph or the correct functional form. Nonetheless, this is original and innovative work, and while the text needs to clarify the intended application, the authors' replies to reviews make me fairly confident this can be done.
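To make the setting described above concrete, here is a minimal, hypothetical sketch of the kind of data-generating process the meta-review outlines: every system ("sample") uses the same inverse-logit update rule, and only its weighted graph differs. This is an assumed illustration of the setting, not the paper's model or code; all names and parameter values here are invented for the example.

```python
import numpy as np

def shared_dynamics_step(x_prev, W, alpha=0.0):
    # Shared functional form: for every node i,
    #   x_{i,t} = logit^{-1}(alpha + sum_{j != i} W[j, i] * x_{j,t-1})
    # (self-influences are excluded by zeroing the diagonal of W below).
    drive = alpha + W.T @ x_prev
    return 1.0 / (1.0 + np.exp(-drive))  # inverse-logit link

def simulate_system(W, T=200, alpha=0.0, noise=0.05, seed=None):
    # One "sample": N univariate series of length T generated from this system's graph W.
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    x = np.empty((T, n))
    x[0] = rng.uniform(0.2, 0.8, size=n)
    for t in range(1, T):
        x[t] = shared_dynamics_step(x[t - 1], W, alpha) + noise * rng.standard_normal(n)
    return x

# Two independent systems: different (random, sparse) graphs, identical dynamics.
rng = np.random.default_rng(0)
W_a = rng.normal(scale=0.5, size=(5, 5)) * (rng.random((5, 5)) < 0.3)
W_b = rng.normal(scale=0.5, size=(5, 5)) * (rng.random((5, 5)) < 0.3)
np.fill_diagonal(W_a, 0.0)
np.fill_diagonal(W_b, 0.0)
x_a = simulate_system(W_a, seed=1)  # shape (200, 5)
x_b = simulate_system(W_b, seed=2)
```

A method in the spirit of the reviewed paper would then pool data across such systems to learn the shared update rule while inferring each system's graph separately.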
[Tokenized columns for the example above omitted: the input_ids and labels fields repeat the same review and summary text as long lists of integer token IDs, and the attention_mask field is a list of 1s of matching length.]
10140, 281, 50276, 4674, 495, 19756, 253, 4278, 670, 253, 1332, 281, 4271, 21624, 44667, 398, 594, 253, 1332, 310, 417, 2779, 41374, 50276, 74, 13414, 2096, 2139, 253, 4477, 4081, 1332, 310, 10237, 1014, 604, 627, 403, 8763, 44667, 398, 253, 4278, 670, 253, 1332, 285, 247, 10527, 8813, 323, 352, 310, 3058, 50276, 74, 4282, 604, 253, 2045, 3082, 2429, 275, 436, 2929, 15693, 19349, 14580, 342, 11945, 604, 417, 253, 4477, 878, 281, 2085, 271, 8813, 670, 253, 4278, 670, 849, 597, 2429, 616, 1332, 342, 253, 2045, 3082, 50275, 19164, 414, 285, 8453, 627, 310, 642, 38135, 275, 253, 1895, 873, 9713, 275, 436, 2929, 436, 2929, 29328, 247, 747, 1332, 323, 271, 5368, 1895, 2299, 352, 19756, 253, 4278, 273, 253, 1332, 285, 253, 10527, 8813, 281, 1329, 253, 12510, 273, 253, 1332, 50276, 74, 4282, 2139, 253, 4477, 12661, 281, 3037, 247, 19349, 4216, 275, 253, 830, 273, 40006, 673, 2962, 19349, 14580, 6010, 14580, 253, 14580, 22245, 407, 253, 1332, 4081, 407, 1465, 87, 11078, 257, 1162, 355, 4267, 4428, 625, 7000, 1491, 670, 19349, 7688, 275, 619, 4743, 253, 19349, 14580, 4081, 275, 436, 2929, 403, 18134, 281, 253, 14580, 4561, 407, 253, 3082, 4081, 275, 5368, 2561, 275, 2426, 273, 253, 2508, 273, 253, 1491, 253, 4477, 878, 281, 19148, 436, 1127, 253, 4477, 3730, 281, 268, 2521, 1162, 355, 4240, 533, 891, 812, 417, 1089, 667, 20121, 670, 6010, 14580, 275, 253, 1984, 5474, 339, 1831, 10019, 414, 50275, 783, 2929, 3400, 1633, 747, 10775, 352, 9365, 253, 1895, 3164, 417, 2568, 973, 1911, 2079, 275, 253, 5368, 6239, 326, 19349, 8900, 3082, 778, 9441, 247, 1027, 19349, 4216, 323, 1016, 3410, 672, 253, 941, 403, 673, 2962, 253, 3753, 273, 436, 1895, 943, 452, 644, 1805, 17285, 432, 247, 7605, 10527, 1127, 273, 1859, 323, 1650, 310, 436, 1955, 281, 1327, 1326, 351, 5755, 390, 1327, 20502, 15752, 273, 253, 6944, 19191, 1232, 253, 2934, 273, 717, 430, 1320, 2439, 1027, 19349, 14580, 949, 253, 9706, 285, 28490, 1159, 310, 1077, 3236, 387, 1878, 352, 310, 747, 281, 479, 2905, 2987, 403, 9113, 5469, 50276, 9188, 40348, 50275, 783, 2929, 12453, 13353, 271, 1774, 1895, 10775, 19349, 19331, 2439, 3530, 534, 403, 4623, 281, 253, 2590, 3114, 253, 4081, 1332, 3012, 16424, 253, 1375, 23037, 14387, 3738, 253, 17200, 273, 436, 1332, 7024, 1077, 1199, 327, 253, 9376, 273, 6096, 8062, 352, 310, 2834, 281, 2939, 1880, 436, 9376, 6556, 390, 1057, 417, 2186, 275, 16774, 4893, 253, 4477, 1246, 4893, 281, 15524, 941, 760, 253, 1332, 4081, 275, 253, 2929, 604, 1805, 17194, 285, 17285, 310, 2779, 281, 452, 690, 3486, 3345, 253, 2590, 3114, 24088, 275, 6551, 21559, 15065, 285, 4931, 20701, 253, 2929, 34865, 265, 327, 253, 650, 3751, 46449, 7792, 3103, 352, 50276, 11159, 320, 625, 4722, 323, 4910, 273, 2175, 342, 1029, 4294, 941, 50276, 48746, 3290, 50275, 783, 4081, 2746, 310, 22335, 3590, 533, 891, 3264, 625, 4278, 670, 253, 32049, 285, 29810, 1159, 5884, 1127, 275, 5150, 374, 253, 673, 3605, 273, 253, 6046, 4778, 943, 3164, 320, 246, 18, 3185, 273, 246, 50276, 498, 15752, 50275, 783, 2929, 310, 4518, 3542, 285, 973, 7397, 1701, 253, 16038, 273, 253, 2929, 812, 320, 5520, 352, 943, 320, 31637, 762, 534, 2515, 359, 755, 1027, 14580, 432, 1027, 3530, 50276, 395, 534, 7533, 1329, 253, 6096, 8062, 9376, 50275, 1189, 455, 6803, 5322, 7680, 275, 2426, 273, 747, 1332, 50276, 13149, 533, 253, 2929, 3198, 281, 320, 1805, 17194, 352, 943, 320, 1805, 31637, 2139, 352, 310, 4722, 390, 1014, 3590, 281, 16932, 366, 19349, 8900, 3453, 689, 643, 3530, 50276, 187, 187, 4118, 18435, 27, 2520, 2929, 39223, 271, 4722, 533, 18052, 1895, 275, 19349, 8900, 
323, 2709, 390, 21471, 673, 2962, 50276, 783, 3242, 3753, 273, 253, 1895, 9978, 310, 417, 387, 512, 2590, 432, 253, 2929, 5734, 352, 310, 1239, 1077, 9257, 285, 2096, 1598, 13477, 387, 1878, 581, 37317, 594, 604, 436, 310, 7607, 253, 2457, 2715, 943, 452, 247, 3012, 17265, 12002, 285, 10199, 50276, 783, 4758, 347, 891, 2096, 352, 310, 347, 3637, 50276, 664, 452, 2709, 941, 5239, 1016, 273, 534, 8414, 273, 295, 36474, 673, 2962, 512, 273, 2978, 246, 285, 8025, 281, 320, 11719, 285, 10486, 2540, 50276, 14382, 941, 873, 310, 3551, 432, 271, 3907, 985, 50276, 20513, 2718, 390, 941, 5239, 403, 752, 253, 13818, 21643, 314, 5841, 3530, 50276, 1439, 760, 403, 597, 10126, 3907, 533, 253, 4216, 273, 3576, 17769, 2190, 253, 295, 36474, 4870, 310, 1027, 432, 581, 985, 281, 1529, 50276, 35529, 627, 310, 247, 1846, 5164, 830, 6096, 2439, 512, 253, 2718, 323, 849, 253, 2852, 273, 1016, 273, 253, 295, 7632, 310, 4561, 432, 697, 1211, 2469, 285, 253, 2469, 273, 697, 15833, 50276, 2520, 19986, 432, 985, 281, 985, 760, 598, 281, 247, 1442, 959, 37613, 873, 273, 3602, 534, 6125, 4216, 2605, 347, 973, 347, 24088, 4602, 4757, 50276, 40622, 323, 4227, 359, 1537, 320, 2819, 387, 16069, 14954, 4142, 285, 3981, 326, 1269, 262, 50276, 2690, 2808, 262, 18, 1274, 1637, 50276, 2204, 75, 425, 82, 891, 88, 8020, 1269, 42565, 18, 918, 835, 253, 9376, 273, 247, 1846, 5164, 830, 310, 4645, 598, 275, 253, 823, 5714, 285, 253, 275, 735, 293, 462, 262, 4243, 50276, 2520, 310, 2649, 247, 1270, 1566, 273, 16069, 2380, 533, 368, 755, 253, 2934, 347, 253, 4477, 1333, 275, 616, 32114, 281, 10591, 6151, 285, 347, 253, 7714, 28145, 271, 4755, 2898, 273, 436, 651, 320, 281, 1554, 466, 732, 16104, 613, 1402, 16069, 19654, 835, 1016, 941, 873, 432, 247, 1027, 5893, 651, 320, 7663, 1027, 8512, 342, 247, 1027, 4216, 533, 581, 1537, 3524, 323, 247, 1846, 16069, 2380, 5122, 2439, 5661, 5705, 50276, 74, 16216, 1705, 598, 342, 247, 1273, 21414, 2898, 3340, 417, 342, 4229, 295, 2439, 3530, 534, 3133, 1774, 281, 253, 9706, 8632, 4442, 3213, 50276, 783, 15832, 275, 436, 2929, 310, 281, 4858, 4715, 253, 1846, 5164, 830, 273, 253, 4972, 47694, 7186, 432, 4715, 253, 4216, 7624, 6941, 323, 45900, 273, 1491, 670, 253, 6096, 629, 273, 253, 1566, 2439, 941, 5239, 50276, 783, 5012, 5194, 326, 436, 310, 19080, 314, 2218, 2167, 352, 310, 417, 7094, 2590, 752, 253, 7787, 327, 253, 43541, 1612, 273, 436, 1332, 403, 4543, 762, 752, 2515, 352, 588, 29623, 327, 2057, 253, 3451, 4216, 390, 253, 3451, 5164, 830, 50276, 4160, 14153, 436, 310, 3236, 285, 16694, 789, 285, 1223, 2505, 3198, 281, 19148, 253, 6034, 2898, 253, 4477, 32114, 281, 10123, 1056, 479, 9648, 13224, 436, 476, 320, 2218 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this work investigates molecule manipulation towards desirable properties by leveraging existing deep generative models the paper uses linear models trained on latent dimensions of an existing deep generative model for molecular property prediction the normal directions of the learned decision boundaries are then used to manipulate molecules strengths 1 the paper compares proposed approach with random manipulation and largest range manipulation on three different datasets there different deep generative models and different molecular properties weaknesses 1 the idea of using linear models trained on latent dimensions of existing deep generative models for steerable molecule generation or for interpretinginteracting with generative models are not new for example see das et al nature biomedical engineering volume 5 pages 613623 2021 2 steerable molecule generation using predictors trained on latent dimensions has been investigated before eg chenthamarakshan et a neurips 2020 which has been shown to tackle multiproperty manipulation well as well 3 the paper does not provide a comprehensive review of the existing works on steerable molecule generation nor does it consider existing optimizationsampling based methods as baselines chenthamarakshan et al neurips 2020 fu et al aaai 2018 yang et al icml 2020 arxiv201101921 4 current works consider many different complex molecular properties that are relevant for realworld applications eg therefore this statement does not hold for example existing work only discover two molecular properties penalized logp octanolwater partition coefficient and qed druglikeness jin et al 2018 shi et al 2020 liu et al 2018 for example httpsarxivorgabs191205910 considers drd2 activity prediction 5 it is not clear what extent of structural diversity is considered in ssr estimation the idea of leveraging linear property predictors trained on latent dimensions of deep generative models for molecule manipulationcontrolled molecule generation is not novel the work does not consider comparison with existing baselines nor does it report performance on realworld property manipulation tasks such as protein activity manipulation it is not clear what is the realworld significance of continuous property change docsepthis paper proposes to analyse latent spacebased generative models of molecules by measuring how separable the latent space is with respect to several molecular properties if a property is sufficiently separable then it can be affected by moving along the normal vector of the separation plane experiments compare the streerability of several established generative models detailed pros 1 interpreting and dissecting generative models of molecules is an interesting and important direction and one that so far has not been sufficiently explored moreover the authors have built an interactive system which makes their work much more accessible 2 the paper is mostly clear and also includes helpful figures figure 1 gives an especially good overview figure 2 is useful to get some qualitative intuition detailed cons 1 the work makes several very strong assumptions about what a good latent spacebased generative models is without sufficient evidence or ablations that this makes sense overall its not clear what the takeaways should be or how we should interpret the comparison between the models a section 5 gives qualitative results showing that nearby latent points decode 
to similar molecules and then goes on to assume that properties are linear functions of the latent space which is a very strong assumption its not clear if here property is linearly separable is a good approximation of property can be reliably predicted with a nonlinear model or property is easy to control using optimization one way to tackle this would be to show ablations for example proving that if a property is hard to predict with a linear model then its also hard to predict it generally given only the latent code or that its hard to optimize for it b the steerability metric assumes monotonic changes to the property which is rather strict and its unclear if results would look different if one used a soft monotonicity requirement c to add to this its not clear if the streerability or separability metrics correlate with something useful downstream eg with optimization performance or with the ability of the optimization algorithm to exploit overfit to property predictors during optimization 7 d while the authors propose to do optimization by following the normal vector of the separation plane that does not sound like a very powerful or practical optimization method it is not compared to more general methods 16 and it may not be worthwhile to do this comparison as i wouldnt expect this to do very well one argument the authors make is efficiency which is fair although its important to keep in mind that blackbox optimization methods are sometimes also very efficient eg when the property predictor is learned directly from the latent space or its learned from the molecule space with a very simple model eg random forest on molecular fingerprints thus i think its hard for the optimization method proposed in this work to really compete with more general ones and the paper should focus more on explanabilityinterpretabilitypredicting which models will be good to optimize in without running optimization e finally its not clear what getting a bad result on the metrics proposed by the authors means for example the authors mention that cgvae has a large gap between ssr and sr meaning that if often produces molecules that are different but have the same property value its unclear to me if thats a bad thing 2 some statements about prior work are just not true a current methods are confined to a limited number of molecular properties which hinders realworld applications in drug discovery and material science for example existing work only discover two molecular properties penalized logp and qed false as many existing methods can optimize towards an arbitrary invivocomputable objective function 15 for most of these models i know that they are used in reallife drugdiscovery projects in the pharma industry while many other works do work on toy properties such as ones mentioned by the authors its important to note this is not always the case even objectives in the guacamol benchmark 6 are designed to be more realistic that optimizing towards logpqed by combining several properties and occasionally other constraints i think a more realistic discussion is needed here referring to 16 and other papers as well i would also suggest to pivot this work more strongly on the interpretability aspect and its quantification instead of claiming that it actually improves molecular optimization in itself b to add to the above even the abstract says it is difficult to customize the output molecule with desired properties again suggesting that molecular optimization is hard in itself as mentioned above molecular optimization 
is wellstudied and just getting high values of the optimization objective is typically not hard and often can be achieved with even relatively simple genetic algorithms on the other hand i agree with low interpretability of such techniques and that they are not very wellsuited for interactive design so there is certainly room for methods similar to the one presented in this work if framed correctly other comments the paper keeps referring to druglike and drugunlike molecules while actually talking about high qed and low qed it would be useful to make this explicit druglikeness is a vague and hard to define notion while qed is a simple handcrafted approximation of that notion bottom of page 5 is confusing as it shuffles may very strong assumptions without calling them out explicitly first it talks about manipulating multiple properties but equations suggest its talking about something more specific which is manipulating a linear combination of several properties it then states that property values follow a multivariate normal distribution which is a very unexpected conclusion and is only true because of the strong linearity assumption figure 4 is very hard to read maybe use a differrent form to present this result section 63 mentions that moflow performed better than cgvae on the metric defined by the authors due to its reversibility i think given the vast amount of differences between the models attributing the difference to reversibility is just speculation so should be marked as such overall that whole paragraph 2 is highly speculative and in my opinion doesnt really bring much table 1 is hard to read because it compares both different models and different ways of choosing the latent direction at the same time maybe it would be more readable to first compare the ways of choosing latent directions establish that the proposed way works the best and then compare the models it is very odd that the largest manipulation direction performs worse than random it would be useful to get a bit more intuition into why that happens bottom of page 8 defines several ranges what are these ranges for distance in the latent space nits didnt influence my score just here to help abstract generative models can synthesize new molecules i guess you mean propose or design as synthesize would imply the molecules are actually made page 2 the steering the i guess replace the second the with of page 3 latent space which is usually modelled as a gaussian distribution latent space which is commonly assumed to be gaussian distribution technically its the prior that is gaussian while the latent space is rl which is not a distribution page 3 and capable and is capable page 3 there exists property functions fp which defines id drop the s in both cases page 5 scaling the changes scales the changes equation 10 im assuming this mathcale is supposed to mean expectation if so then its more common to use mathbfe for that many places use mathcalr to denote the set of real numbers while again its common to use mathbfr in equation 8 the scaling factor alpha seems to turn into k in equation 13 and then alpha means something different page 8 obverse observe references 1 efficient multiobjective molecular optimization in a continuous latent space 2 learning to extend molecular scaffolds with structural motifs 3 reinvent 20 an ai tool for de novo drug design 4 reinforced molecular optimization with neighborhoodcontrolled grammars 5 hit and lead discovery with explorative rl and fragmentbased molecule generation 6 guacamol benchmarking 
models for de novo molecular design 7 on failure modes in molecule generation and optimization while the direction pursued by the paper is interesting there are many strong assumptions made with little justification and overall its not clear how one should interpret the results in this work or what the impact on realistic optimization tasks is finally some things are also unclear and the relation to prior work is somewhat misrepresented thus i believe this paper falls below the acceptance threshold docsepthis paper proposes molecular space explorer molspace a method to generate molecules with continuously varying properties using a pretrained latent variable generative model it essentially involves 3 steps although these steps were not clearly spelled out in the paper 1 sample many points from the model and evaluate their properties 2 for each property train an svm to predict the property given the latent vector 3 use the normal to this hyperplane as a property manipulation direction in latent space which can be added to a latent vector to change the property value of the decoded molecule overview this paper addresses an important problem which few prior works have addressed before in the literature their rough approach is reasonable however i had a lot of questions and doubts some major and some minor which i will discuss below i will use major and minor to denote issues which are extremely and moderately important to me respectively problem definition i am sympathetic to your problem setting of continuously tuning the properties of molecules minor as this could be used to make a lot of practical tools such as the interesting demo you provide in the end roughly given a starting molecule you should be able to produce another similar molecule but with slightly higher or lower property values i applaud your efforts to try to define this more precisely in section 4 but i think your definition has several significant problems major there seem to be no constraints on the molecules outputted eg a similarity constraint to the starting molecule without this your metrics could be easily exploited by having a sorted list of 1000 molecules and associated property values and given a starting molecule output in order all molecules above/below the starting molecules property value a similarity constraint would remove this exploit for example although there are certainly other ways to do this i think the monotonicity conditions in your sr and ssr are too strict given the high degree of randomness in generative models by your metrics a property sequence of 1, 1, 1, 1.00001, 1.001, 1.001 would satisfy both sr and ssr while 3, 2, 1, 0.01, 0, 1, 2, 3 would not even though the second covers a much broader property range and to me seems more in line with what practitioners would want i acknowledge that this criticism is subjective i object to the use of the word continuous in your definitions which you use several times all molecular outputs are discrete and many of the properties you manipulate are integer valued in the strict technical sense of the word nothing you do has continuity although it does match the nontechnical usage of the word perhaps you could say stepwise or monotonically instead i acknowledge that creating a good definition of this is hard and i dont claim to have a great definition myself the best rough suggestion i can think of would be to define a manipulation in k steps as a sequence of molecules $m_1, \ldots, m_k$ such that $f_p(m_i) < f_p(m_{i+1}) - \epsilon$ for all $i$, for some reasonable $\epsilon$, and that $\text{sim}(m_i, m_{i+1}) > \delta$ for all $i$, for some similarity metric. given this i would produce a long sequence of molecules using your method until the first instance where the conditions are violated and it ceases to be a valid manipulation you could then score each manipulation based on its length longer is better and the magnitude of the property difference between the starting and ending molecules larger is better which exact scoring function to use is unclear as well as the appropriate value of $\epsilon$ and $\delta$ but hopefully you get what i mean from this sketch related work your paper is closely related to a variety of existing works on molecule generation while you have many important citations i think that you are missing many others minor bombarelli et al 2018 should really be credited with the invention of bo in the latent space of generative models not jin et al a variety of other works which do molecule optimization 1-7 work on graph to graph translation eg 8 which is very similar in spirit to yours work on conditional generative modelling for molecules eg 9 which could also be used for your problem setting there are also many other relevant works especially in computer vision which aim to modify images by manipulating latent vectors as far as i could tell no such analogous methods are mentioned in your paper 10-13 are some starting references for you many of these papers propose methods similar enough to yours that you should discuss how the method you propose is different minor your proposed method section 5 a questionable argument in section 5.1 i was not sympathetic to your argument in section 5.1 minor while yes molecules with similar structures tend to cluster together in latent space and structures determine properties it does not follow that molecules with similar properties will cluster together this is because of activity cliffs a common and well-understood phenomenon where molecules which are very similar can have very different properties 14 they are caused by many things in biology having cutoffs or other nonlinear effects eg if a molecule is a little bit too big it cant fit into a receptor and will have 0 activity reading this section it felt like the authors were not aware of this although you dont state it explicitly the success of your method depends on molecules with similar property values clustering together i think that you should verify this more explicitly although i do appreciate that in figure 2 you validated that structurally similar molecules cluster together which is an important start to producing similar molecules given a target molecule one nitpick about figure 2 you dont mention how these similarity values are calculated latent directions regression becomes classification in sections 5.2-5.3 your motivation seems to significantly switch from regression based manipulating realvalued properties to classification based finding a hyperplane between positive and negative classes i find this awkward and not well explained major in general the existence of a separating hyperplane can only be guaranteed when the positive and negative examples form disjoint convex sets this seems unlikely to occur with generative models with a gaussian prior it seems plausible to me that a generative model could have multiple latent directions which cause the property values to increase for example consider molecular weight which is modified by adding/removing/substituting atoms effectively anything since there are many ways to add/remove atoms from a molecule i imagine that such a property could be modified by several
directions in latent space and that there would not be a n1 dimensional hyperplane separating positive and negative classes even if there is just 1 direction of increase which would make the separating hyperplane assumption sensible i find the separation into positive and negative classes quite artificial minor i could not find an explanation of this in your paper but thankfully was able to find it in the code thanks for making your code accessible and quite readable minor it looks like you choose the 2nd percentile of your dataset as the cutoff between positive and negative so you will have very unbalanced classes i am a unsure of why this choice was made was it driven by performance or just an arbitrary choice an explanation of these choices would make the paper stronger experiments i was very happy to see the authors evaluate their method on 3 dataset and over 100 properties major often i think papers of this sort test too few properties datasets and i am glad that the authors did not do this i think that the choice of experiments was generally good but i have concerns about some choices that the authors made datasets i was very happy with the authors choice of datasets minor baselines i think the authors were too narrow in their choice of baselines major they compared their method against 2 simple methods which are identical in every way except for the choice of hyperplane there are many other methods which i think would make competitive and sensible baselines notably conditional vaes and graph to graph translation methods like 8 i acknowledge that these methods dont work with pretrained models but as a reader i would want to know whether the srssr scores are competitive in an absolute sense or whether they are just competitive relative to other methods which use pretrained models having at least 1 competitive baseline would make the paper much stronger molecular properties although the authors chose a large number of properties to test i felt that these properties were all superficial molecular descriptors from rdkitdescriptorsdesclist which are intrinsically not as difficult to manipulate as realworld experimental properties such as drug activity major the properties mostly fall into just a few categories which can be considered separately molwt logp heavyatomcount and similar properties are essentially a weighted sum over the atoms in a molecule they can be trivially manipulated by adding and removing specific atoms from anywhere in the molecule all properties after nhohcount are fragment properties which count instances of particular substructures they can be trivially manipulated by simply adding or removing instances of this substructure qed is more complex but is essentially a measure of whether a molecule fulfills a specific set of conditions for molwt logp and counts of certain fragments it is somewhat more complex to manipulate but still not too difficult in my opinion fpdensitymorgan are essentially a normalized count of the number of unique atomic environments in a molecule and can be increaseddecreased by addingremoving unique substructures admittedly im 90 sure about this one estate chi kappa balabanj are some sort of graph theoretic properties which i dont have a good intuition for so i dont really know how difficult they are to manipulate because of the simplicity of these properties i did not find the manipulations shown to be impressive i think it would make the paper much stronger to consider more complex properties my first suggestion is the goaldirected objectives 
from guacamol 15 which are mostly based on fingerprints and therefore still not super realistic preferably i would like to see results on at least 1 objective which is not similarity or rediscovery since those are too easy my second suggestion would be molecular docking scores since these are more realistic you could consider using the dockstring package 16 for this it would not surprise me if molspace performed poorly on these more complex properties results in general i liked the presentation of the results minor but have a few questions minor 1 in table 1 the averages for sr are often a lot higher than all the 7 property values you show in the table eg 60 for chembl hiervaem while all the other values are 30 this implies that there are either outliers or that the properties shown in the table are somehow atypical can you provide clarification one suggestion might be to not show the average but instead show the average rank of all the baselines the average can be highly influenced by some tasks being easier than others 2 figure 3 is very unclear are the plots for a single property or for the 7 properties you study i am not sure what the figure is showing or how to interpret it 3 in section 64 did you achieve similar results with the fragment properties or any more complex properties references 1 grammar variational autoencoder httparxivorgabs170301925 2 molecular denovo design through deep reinforcement learning httpsdoiorg101186s133210170235x 3 generating focused molecule libraries for drug discovery with recurrent neural networks 101021acscentsci7b00512 4 optimization of molecules via deep reinforcement learning 101038s4159801947148x 5 sampleefficient optimization in the latent space of deep generative models via weighted retraining httpsproceedingsneuripsccpaper2020hash81e3225c6ad49623167a4309eb4b2e75abstracthtml 6 constrained bayesian optimization for automatic chemical design using variational autoencoders httpspubsrscorgencontentarticlelanding2020scc9sc04026a 7 optimizing molecules using efficient queries from property evaluations httparxivorgabs201101921 8 learning multimodal graphtograph translation for molecular optimization httparxivorgabs181201070 9 conditional molecular design with deep generative models httpsdoiorg101021acsjcim8b00263 10 gan dissection visualizing and understanding generative adversarial networks httparxivorgabs181110597 11 interpreting the latent space of gans for semantic face editing httpsopenaccessthecvfcomcontentcvpr2020htmlsheninterpretingthelatentspaceofgansforsemanticfaceeditingcvpr2020paperhtml 12 on the steerability of generative adversarial networks httparxivorgabs190707171 13 controlling generative models with continuous factors of variations httparxivorgabs200110238 14 httpswwwncbinlmnihgovpmcarticlespmc3869489 15 guacamol benchmarking models for de novo molecular design 101021acsjcim8b00839 16 httpsarxivorgabs211015486 overall i am happy to see a paper addressing this worthwhile problem but feel that there are too many issues to recommend acceptance and therefore i must recommend rejection in order to change my mind i think the following would need to be addressed 1 clarify differences between your method and similar methods for discovering directions in latent space proposed in 1013 2 further clarification or justification of finding a hyperplane using a classification method 3 comparing against a set of nontrivial baselines and having superior or at least reasonably similar performance i also think the paper could be made much stronger by 1 
using a more difficult set of molecular properties 2 improving the definition of molecular manipulation and the metrics proposed docsepthis paper presents a method named molspaceexplorer that explores the latent space of a molecule generative model to continuously optimize molecules toward desired properties. they identify latent directions by constructing a separation boundary hyperplane on the latent space and use the directions to improve latent vectors which are then fed into the generative model to produce new molecules with desired properties the authors also developed an interface for interactive molecular discovery the authors mentioned that the main contributions of this study are 1 this method is modelagnostic thus applicable to any molecule generative model 2 it does not require any retraining of the molecule generative model my first concern about this study is that there are many methods that have similar concepts modelagnostic and does not require retraining of the molecule generative model some examples are listed below they commonly perform optimization on the latent space toward desired properties without modifying the molecular generative model also they are modelagnostic applicable to any molecule generative model im wondering if the proposed method is superior to or has advantages in any aspect over existing methods gómez-bombarelli r wei j n duvenaud d hernández-lobato j m sánchez-lengeling b sheberla d aspuru-guzik a 2018 automatic chemical design using a datadriven continuous representation of molecules acs central science 4(2) 268-276 griffiths r r hernández-lobato j m 2020 constrained bayesian optimization for automatic chemical design using variational autoencoders chemical science 11(2) 577-586 notin p hernández-lobato j m gal y 2021 improving blackbox optimization in vae latent space using decoder uncertainty arxiv preprint arxiv:2107.00096 bayesian optimization on the latent space to manipulate molecules kwon y kang s choi y s kim i 2021 evolutionary design of molecules based on deep learning and a genetic algorithm scientific reports 11(1) 1-11 genetic algorithm on the latent space to manipulate molecules winter r montanari f steffen a briem h noé f clevert d a 2019 efficient multiobjective molecular optimization in a continuous latent space chemical science 10(34) 8016-8024 particle swarm optimization on the latent space to manipulate molecules my second concern is whether the proposed method provides interpretability while the authors mentioned their method is effective for interpreting molecular generative models the results the authors presented are gradual modifications of a query molecule and their respective properties just like other related studies i dont see anything particularly interpretable i think this paper does not demonstrate any significant contributions compared to existing studies ### Summary:
the reviewers find the work to address an interesting and important problem but have several critical concerns about its insufficient treatment of prior work in this area and its lack of novelty in relation to the body of existing literature
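since several of the reviews above describe the manipulation recipe only in prose (sample latent vectors from a pretrained generative model, fit a linear separator on a thresholded property label, then walk a latent code along the normal of the resulting hyperplane and decode), a minimal sketch may help make it concrete. this is not the paper's actual implementation: the latent vectors and the "property" are synthetic, the decode call is only indicated in a comment, and the step size, latent dimensionality and percentile cutoff are all made-up illustrative values.

```python
import numpy as np
from sklearn.svm import LinearSVC

# synthetic stand-ins for latent codes sampled from a pretrained generative model
# and for a molecular property score (e.g. qed or logp) computed on the decoded molecules
rng = np.random.default_rng(0)
latents = rng.normal(size=(2000, 64))                       # z ~ prior of the model
property_scores = latents[:, 0] + 0.1 * rng.normal(size=2000)  # toy "property"

# binarize the property; a high-percentile cutoff gives very unbalanced classes,
# mirroring the 2nd/98th-percentile split one reviewer points out in the released code
labels = (property_scores > np.quantile(property_scores, 0.98)).astype(int)

# step 2 of the recipe: fit a linear separator on the latent codes
clf = LinearSVC(C=1.0, max_iter=10000).fit(latents, labels)
direction = clf.coef_[0] / np.linalg.norm(clf.coef_[0])     # unit normal of the hyperplane

# step 3: walk a starting latent vector along the normal and (hypothetically) decode
z_start = latents[0]
for k in range(1, 6):
    z_new = z_start + k * 0.5 * direction
    # molecule = generative_model.decode(z_new)   # hypothetical decode call
    print(k, float(z_new @ direction))            # projection onto the direction grows monotonically
```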
[ 352, 3133, 21541, 281, 479, 326, 247, 1006, 800, 1566, 812, 452, 2709, 21624, 10746, 534, 2847, 253, 2867, 2193, 281, 2572, 323, 1650, 1908, 5787, 2801, 534, 310, 7321, 407, 6240, 2013, 729, 723, 538, 46306, 10917, 8069, 2712, 1580, 627, 403, 1142, 4088, 281, 823, 12163, 10917, 432, 247, 12570, 891, 8564, 326, 824, 247, 2867, 812, 320, 7321, 407, 2067, 10746, 275, 21624, 2317, 285, 326, 627, 651, 417, 320, 247, 295, 18, 15759, 4373, 13568, 23694, 2762, 285, 4016, 5971, 50276, 9154, 604, 627, 310, 816, 337, 3884, 273, 2572, 534, 651, 1056, 253, 23694, 4373, 13568, 9376, 24600, 891, 1089, 253, 9712, 715, 2762, 285, 4016, 5971, 3240, 13345, 5884, 891, 812, 417, 1089, 271, 8813, 273, 436, 275, 634, 2929, 533, 5717, 2920, 369, 2104, 281, 1089, 352, 275, 253, 2127, 6701, 323, 2403, 634, 2127, 12482, 285, 3240, 34025, 5884, 352, 4453, 751, 368, 5206, 253, 374, 2109, 36384, 273, 634, 10895, 347, 253, 23046, 875, 2762, 285, 4016, 594, 368, 588, 452, 1077, 440, 30063, 5971, 891, 717, 247, 31488, 273, 2139, 436, 4327, 369, 1160, 369, 352, 8877, 407, 3045, 390, 816, 271, 10341, 4327, 271, 8813, 273, 841, 10165, 651, 1056, 253, 2929, 10046, 50275, 16217, 3825, 50276, 74, 369, 1077, 5211, 281, 923, 253, 4477, 7472, 616, 1332, 327, 495, 10895, 285, 689, 2233, 3607, 2201, 2223, 891, 1158, 9380, 273, 436, 3686, 1071, 1512, 1643, 3607, 50276, 46906, 1507, 285, 891, 717, 9995, 326, 253, 4477, 858, 417, 513, 436, 891, 1158, 326, 253, 4327, 273, 4679, 369, 3839, 1175, 533, 891, 452, 7350, 670, 690, 10165, 326, 253, 4477, 1160, 50275, 46906, 1507, 50276, 74, 369, 1077, 5211, 342, 253, 4477, 4327, 273, 15302, 5884, 50275, 10352, 25379, 50276, 74, 1158, 253, 4477, 497, 1512, 6891, 275, 616, 4327, 273, 1666, 25379, 2201, 597, 2429, 616, 1332, 1411, 374, 2969, 3082, 534, 403, 8931, 275, 1046, 1039, 3707, 323, 253, 4327, 273, 4373, 13568, 627, 403, 1142, 643, 3082, 534, 891, 1158, 651, 1056, 12085, 285, 24600, 1666, 25379, 19836, 17697, 13460, 265, 285, 4216, 281, 4216, 10234, 3082, 751, 854, 891, 14409, 326, 841, 3082, 13414, 789, 342, 3215, 11273, 3210, 533, 347, 247, 9414, 891, 651, 971, 281, 871, 1880, 253, 256, 43053, 83, 7363, 403, 12085, 275, 271, 7880, 3282, 390, 1880, 597, 403, 816, 12085, 4103, 281, 643, 3082, 534, 897, 3215, 11273, 3210, 1907, 387, 1878, 337, 12085, 8245, 651, 1056, 253, 2929, 1199, 10046, 50275, 36911, 3607, 50276, 20261, 253, 4477, 9703, 247, 1781, 1180, 273, 3607, 281, 1071, 891, 3543, 326, 841, 3607, 497, 512, 28019, 5787, 42785, 432, 47939, 11554, 3229, 1687, 641, 3229, 498, 382, 534, 403, 45654, 417, 347, 2834, 281, 26526, 347, 1524, 10186, 5661, 3607, 824, 347, 2854, 2425, 2201, 253, 3607, 6571, 2965, 715, 816, 247, 1643, 9050, 534, 476, 320, 2783, 11794, 50275, 17071, 17118, 2412, 81, 5536, 14054, 5560, 285, 2074, 3607, 403, 9093, 247, 17375, 2020, 689, 253, 10917, 275, 247, 12570, 597, 476, 320, 35820, 1365, 32494, 407, 6240, 285, 11922, 2173, 10917, 432, 9825, 275, 253, 12570, 50276, 455, 3607, 846, 31386, 1368, 5560, 403, 10087, 3607, 534, 1385, 10872, 273, 1798, 749, 45345, 597, 476, 320, 35820, 1365, 32494, 407, 3365, 6240, 390, 11922, 10872, 273, 436, 749, 18317, 50276, 82, 264, 310, 625, 2570, 533, 310, 9093, 247, 2557, 273, 1880, 247, 12570, 3744, 44849, 247, 2173, 873, 273, 2515, 323, 14008, 17118, 2412, 81, 285, 9372, 273, 2176, 14251, 352, 310, 8489, 625, 2570, 281, 26526, 533, 1335, 417, 1512, 2834, 275, 619, 4743, 50276, 16983, 20425, 78, 7397, 403, 9093, 247, 12650, 1385, 273, 253, 1180, 273, 4451, 13805, 12620, 275, 247, 12570, 285, 476, 320, 2559, 40600, 833, 
407, 6240, 2013, 11305, 4451, 749, 45345, 47421, 516, 5091, 2119, 670, 436, 581, 50276, 383, 366, 21477, 465, 5596, 4273, 23818, 75, 403, 690, 3686, 273, 4216, 253, 30325, 3607, 534, 891, 13414, 452, 247, 1175, 30328, 323, 594, 891, 13414, 1663, 871, 849, 2834, 597, 403, 281, 26526, 50276, 12157, 273, 253, 17647, 273, 841, 3607, 891, 858, 417, 1089, 253, 49373, 2011, 281, 320, 13943, 891, 1158, 352, 651, 1056, 253, 2929, 1199, 10046, 281, 1908, 625, 2570, 3607, 619, 806, 14876, 310, 253, 4736, 27481, 16566, 432, 1149, 317, 312, 311, 1458, 534, 403, 6571, 1754, 327, 44901, 285, 3103, 1335, 417, 2221, 15958, 13027, 891, 651, 751, 281, 923, 1543, 327, 387, 1878, 337, 8103, 534, 310, 417, 14259, 390, 2502, 261, 17708, 1580, 1110, 403, 1512, 3477, 619, 1273, 14876, 651, 320, 5787, 36267, 7363, 1580, 841, 403, 625, 15958, 368, 812, 1908, 970, 253, 17463, 2703, 5522, 1668, 323, 436, 352, 651, 417, 9326, 479, 604, 14008, 5641, 2684, 15225, 327, 841, 625, 2570, 3607, 50275, 16680, 50276, 249, 2087, 891, 10490, 253, 9759, 273, 253, 1543, 5884, 533, 452, 247, 1643, 3533, 5884, 50276, 18, 275, 2829, 337, 253, 31218, 323, 49975, 403, 2223, 247, 2257, 2169, 685, 512, 253, 818, 2867, 2193, 368, 921, 275, 253, 2829, 24088, 3925, 323, 3554, 1559, 14260, 677, 66, 358, 1223, 512, 253, 643, 2193, 403, 50276, 1229, 436, 8018, 326, 627, 403, 2057, 42559, 390, 326, 253, 3607, 2011, 275, 253, 2829, 403, 10380, 34162, 476, 368, 2085, 37699, 581, 14876, 1537, 320, 281, 417, 921, 253, 3388, 533, 3185, 921, 253, 3388, 5958, 273, 512, 253, 1666, 25379, 253, 3388, 476, 320, 4122, 12208, 407, 690, 8892, 1146, 6927, 685, 2571, 374, 4677, 495, 310, 1077, 12744, 403, 253, 14777, 323, 247, 2014, 2867, 390, 323, 253, 818, 3607, 368, 1263, 891, 717, 417, 2119, 752, 253, 4677, 310, 4645, 390, 849, 281, 4665, 352, 495, 275, 2593, 6705, 858, 368, 5115, 2074, 1543, 342, 253, 10087, 3607, 390, 667, 625, 2570, 3607, 50275, 250, 3065, 50276, 18, 28146, 39762, 6753, 36465, 2832, 1148, 32693, 2061, 5375, 1166, 2941, 11325, 1099, 50276, 19, 5787, 1850, 19975, 2216, 949, 3676, 35221, 4715, 5987, 3088, 1528, 72, 6903, 20270, 84, 1012, 1237, 6903, 1967, 19568, 89, 50276, 20, 11365, 7106, 12570, 13747, 323, 2854, 8900, 342, 18902, 11454, 6928, 8437, 21940, 317, 1026, 592, 5297, 24, 67, 5523, 805, 50276, 21, 13757, 273, 8094, 3066, 3676, 35221, 4715, 8437, 22439, 84, 30977, 4185, 11325, 2504, 16989, 89, 50276, 22, 3410, 20246, 13757, 275, 253, 21624, 2317, 273, 3676, 1006, 800, 3210, 3066, 17375, 851, 26208, 5987, 856, 22868, 32167, 2824, 550, 20790, 14952, 13362, 3593, 70, 1237, 1099, 68, 23, 324, 30693, 1508, 18146, 66, 21, 22000, 2275, 21, 67, 19, 70, 1976, 15834, 2974, 50276, 23, 20793, 17699, 16561, 13757, 323, 12077, 5793, 2216, 970, 39762, 6753, 2083, 351, 398, 5987, 16712, 18356, 1026, 263, 1541, 6071, 14600, 1373, 272, 14952, 84, 550, 26, 1026, 17884, 1731, 66, 50276, 24, 39793, 8094, 970, 5919, 19241, 432, 2867, 27163, 2832, 1148, 32693, 2061, 5375, 1252, 6903, 26, 1797, 50276, 25, 4715, 23390, 26306, 17309, 384, 2047, 10234, 323, 5787, 13757, 2832, 1148, 32693, 2061, 5375, 1093, 805, 9104, 1967, 50276, 26, 17697, 5787, 2216, 342, 3676, 1006, 800, 3210, 5987, 3088, 1528, 72, 6903, 21940, 18944, 23925, 303, 25, 67, 361, 24235, 50276, 740, 36827, 32508, 5304, 3006, 285, 4685, 1006, 800, 48960, 6928, 2832, 1148, 32693, 2061, 5375, 1093, 883, 740, 34651, 50276, 883, 29375, 253, 21624, 2317, 273, 305, 507, 323, 24705, 2454, 14835, 5987, 5758, 317, 707, 296, 248, 17312, 71, 681, 6071, 17312, 1087, 14952, 2974, 84, 864, 2388, 3456, 1076, 
783, 13324, 592, 4511, 1171, 72, 507, 1542, 6017, 6484, 1664, 264, 2996, 17312, 1087, 14952, 20790, 2974, 50276, 805, 327, 253, 37434, 1430, 273, 1006, 800, 48960, 6928, 2832, 1148, 32693, 2061, 5375, 16129, 26522, 19816, 50276, 1012, 10938, 1006, 800, 3210, 342, 5415, 2616, 273, 10575, 2832, 1148, 32693, 2061, 5375, 1518, 7749, 21378, 50276, 1047, 5987, 1477, 939, 68, 4805, 77, 16192, 6356, 12312, 2617, 68, 13137, 2617, 68, 1839, 2090, 30452, 50276, 1010, 1149, 317, 312, 311, 22791, 272, 3210, 323, 372, 17590, 5787, 2216, 8437, 21940, 18944, 23925, 303, 25, 67, 8897, 1867, 50276, 1036, 5987, 39962, 2061, 5375, 17605, 10496, 27538, 4583, 891, 717, 5211, 281, 923, 247, 2929, 15974, 436, 32811, 1895, 533, 1928, 326, 627, 403, 1512, 1142, 3374, 281, 5583, 14924, 285, 3103, 891, 1364, 5583, 18235, 275, 1340, 281, 1818, 619, 2564, 891, 1158, 253, 1563, 651, 878, 281, 320, 9713, 50276, 18, 19148, 3910, 875, 634, 1332, 285, 2074, 3082, 323, 30375, 10746, 275, 21624, 2317, 4081, 275, 8437, 20, 374, 2007, 37699, 390, 22861, 273, 4560, 247, 4373, 13568, 970, 247, 9162, 1332, 495, 10941, 1411, 247, 873, 273, 37825, 1666, 25379, 285, 1907, 8936, 390, 387, 1878, 12054, 2074, 3045, 50276, 74, 671, 1158, 253, 2929, 812, 320, 1160, 1199, 10046, 407, 50276, 18, 970, 247, 625, 2834, 873, 273, 5787, 3607, 374, 11138, 253, 5426, 273, 5787, 19763, 285, 253, 17082, 4081, 5474, 33032, 2520, 2929, 10262, 247, 1332, 4907, 14008, 5641, 15083, 14071, 326, 33826, 253, 21624, 2317, 273, 247, 12570, 1006, 800, 1566, 281, 14949, 22318, 8094, 2584, 711, 12787, 1463, 6811, 383, 26512, 4271, 21624, 10746, 407, 26736, 247, 1151, 522, 318, 7548, 4373, 13568, 327, 253, 21624, 2317, 285, 897, 253, 10746, 281, 3157, 21624, 11390, 534, 403, 840, 10208, 715, 253, 1006, 255, 710, 1566, 281, 4711, 747, 8094, 342, 6799, 3607, 253, 4477, 671, 3715, 271, 5673, 323, 18366, 5787, 8900, 253, 4477, 5393, 326, 253, 2022, 9021, 273, 436, 1263, 403, 337, 436, 1332, 310, 1566, 1530, 6932, 3021, 7763, 281, 667, 12570, 1006, 800, 1566, 374, 352, 1057, 417, 2430, 667, 851, 26208, 273, 253, 12570, 1006, 800, 1566, 50274, 2577, 806, 4468, 670, 436, 1263, 310, 326, 627, 403, 1142, 3082, 326, 452, 948, 9388, 12342, 1566, 1530, 6932, 285, 1057, 417, 2430, 851, 26208, 273, 253, 12570, 1006, 800, 1566, 690, 6667, 403, 1618, 2708, 597, 7744, 1347, 13757, 327, 253, 21624, 2317, 2584, 6799, 3607, 1293, 26264, 253, 5787, 1006, 800, 1566, 671, 597, 403, 1566, 1530, 6932, 7763, 281, 667, 12570, 1006, 800, 1566, 516, 12371, 604, 253, 4081, 1332, 310, 8936, 281, 390, 310, 11361, 275, 667, 4809, 689, 5368, 3082, 50275, 72, 1405, 30007, 4894, 609, 25658, 391, 359, 74, 480, 295, 3443, 1261, 5353, 277, 617, 9866, 26196, 77, 706, 4611, 480, 278, 3802, 50043, 77, 1205, 8855, 270, 703, 589, 4123, 277, 50275, 4938, 321, 814, 7958, 1479, 247, 4765, 12077, 5793, 2216, 970, 247, 2856, 324, 1069, 257, 5415, 6779, 273, 8094, 913, 84, 4275, 5859, 5976, 30783, 22818, 650, 1648, 334, 84, 391, 391, 50276, 3766, 79, 26196, 77, 706, 4611, 480, 278, 9169, 20793, 17699, 16561, 13757, 323, 12077, 5793, 2216, 970, 39762, 6753, 2083, 351, 398, 5793, 5859, 11633, 8988, 1976, 2691, 417, 249, 268, 617, 9866, 26196, 77, 706, 4611, 480, 278, 50276, 9896, 340, 43425, 11138, 2806, 3364, 13757, 275, 362, 3348, 21624, 2317, 970, 29810, 11649, 549, 32693, 638, 3845, 549, 32693, 19, 12224, 933, 4196, 50276, 32442, 16561, 13757, 327, 253, 21624, 2317, 281, 26526, 12570, 465, 33382, 340, 465, 606, 256, 2093, 74, 340, 256, 50276, 42686, 891, 43425, 16483, 2216, 273, 8094, 1754, 327, 3676, 4715, 
285, 247, 6380, 5933, 8249, 5012, 11334, 11334, 50276, 1541, 1999, 5933, 327, 253, 21624, 2317, 281, 26526, 12570, 8986, 391, 24325, 266, 1792, 269, 2870, 46756, 247, 45633, 358, 288, 642, 269, 50276, 2148, 1748, 277, 247, 6247, 5919, 4471, 6082, 422, 5787, 13757, 275, 247, 5415, 21624, 2317, 5793, 5859, 884, 1706, 854, 11718, 1438, 1348, 50276, 25268, 47025, 13757, 327, 253, 21624, 2317, 281, 26526, 12570, 50276, 2577, 1273, 4468, 310, 1880, 253, 4081, 1332, 3400, 4665, 1430, 1223, 253, 4477, 1313, 10998, 597, 343, 1332, 310, 3576, 327, 29375, 5787, 1006, 800, 3210, 253, 1543, 253, 4477, 3559, 403, 26830, 14586, 273, 247, 7316, 12570, 285, 616, 9056, 3607, 816, 751, 643, 2905, 2175, 891, 13414, 923, 667, 1117, 1230, 4665, 494, 5017, 84, 50276, 74, 1158, 436, 2929, 1057, 417, 7568, 667, 1534, 9021, 2429, 281, 5368, 2175, 2490, 187, 4118, 18435, 27, 783, 30628, 1089, 253, 789, 281, 2953, 271, 4722, 285, 1774, 1895, 533, 452, 2067, 4619, 7350, 670, 697, 12497, 1971, 273, 2720, 789, 275, 436, 2170, 50276, 77, 471, 273, 38135, 275, 5886, 281, 253, 2133, 273, 5368, 6239 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper studies representation learning in the context of reinforcement learning and observes that under some noise assumption the linear spectral feature of corresponding markov transition operator can be obtained in closedform then the paper proposes the socalled spede algorithm that enjoys good theoretical guarantee and empirical results the paper studies an important problem in reinforcement learning and shows some interesting observations here are some of my comments and questions 1 in my opinion this paper is more like one on modelbased learning since equation 5 is the main structure studied in the paper it is remarked that similar observations also hold for large amounts of distribution i am wondering whether the theoretical results could be straightforwardly extended for other distributions other than gaussian distribution 2 i think a more detailed comparison to 1 in terms of technical novelty might be necessary in my opinion 1 focuses on frequentist regret bound and thus proposes a ucblike algorithm that requires an oracle for computational efficiency 1 also proposes a thompson sampling tslike algorithm that does not need such an oracle in my opinion it might be straightforward to show the bayesian regret bound for this tslike algorithm using the results in 1 and the standard and wellknown results in 2 in terms of proving bayesian regret bound could the authors elaborate more on the technical contributions beyond the existing literature 3 in terms of the transition dynamics i think it would be beneficial to the readers if the authors could say more on the dynamics that covered by the paper but not covered by 1 in my opinion providing some simple examples could be helpful 4 could the authors elaborate more on why the proposed algorithm decouples the exploration and representation learning since i think there is still longitudinal coupling across timesteps is this decoupling similar to that of neural linear algorithm in 3 1 sham kakade akshay krishnamurthy kendall lowrey motoya ohnishi wen sun information theoretic regret bounds for online nonlinear control neurips 2020 2 daniel russo and benjamin van roy learning to optimize via posterior sampling mathematics of operations research 2014 3 carlos riquelme george tucker jasper snoek deep bayesian bandits showdown an empirical comparison of bayesian deep networks for thompson sampling iclr 2018 i think this is an interesting paper and i would like to understand better the technical novelty docsepthis paper proposed a practical exploration algorithm for finitehorizon rl problems and provided a theoretical guarantee for its algorithm when the transition kernel of the rl problem can be encoded with rkhs it also conducted experiments to validate its algorithm strength this paper extended the model in lc31 to kernel setting and it also used lemma 16 to remove the dependability on the global lipschitz constant it also conducted experiments to show that its algorithm is practical and efficient weakness after going through the proof i notice that the regret bound is proportion to sigma1 which means that the bound will be bigger when the noise level decrease and should be emphasized in theorem 5 here sigma is the noise level although there are some details in section 32 there should be more detail about the posterior sampling in the experiments section since this paper claimed to provide a practical algorithm and this part is 
not trivial 1 sham kakade akshay krishnamurthy kendall lowrey motoya ohnishi and wen sun informationtheoretic regret bounds for online nonlinear control arxiv preprint arxiv200612466 2020 this paper is significant in the sense that it extends lc31 and provides a theoretical guarantee for a setting that is more complicated than 1 it also uses a novel technique to utilize the noise in transition to remove the dependability on the global lipschitz constant although the dependency on noise might have room for improvement i think it is a good paper and could be accepted docsepthis paper studies the problem of learning representations for rl on the theory side the paper considers the setting where the statetransition is a nonlinear function of the past stateaction plus additive noise and develops a noregret algorithm on the empirical side the paper shows that an adaptation of this algorithm on realworld rl tasks could perform better than existing modelbased algorithms strengths i think overall this paper considers an interesting problem learning representations that is of interest to the community given that a major part of the rl theory literature focuses on learning with a known linear representation or structured function class and much less is known when it comes to learning the representation it is a plus that this paper contains empirical results too on real rl benchmarks which i do appreciate as for a paper with a bulk part being theoretical contributions the experimental results seem mostly convincing to me weaknesses one of my main concerns about the theory part is that this theory may be a somewhat cute but very specific consequence of the gaussian noise assumption it is kind of hard to tell how generalizable this result is or how significant it is as a representation learning result as the gaussian structure density of the noise admits this very form is critically used instead of just for some concentration purposes though i should remark from a more practical perspective im not that worried about using it for realworld problems where it is hard to justify most structural assumptions anyway also from the current papers presentations i find it quite hard to situate the current contributions in the context of representation learning theory in rl and compare with related work as concrete examples section 3 before 31 makes caveats of two prior techniques for representation learning mle and policy cover technique these techniques are discussed quite vaguely and not presented in math consequently i couldnt find the key observation in 31 very motivated since i do not see what the prior methods exactly are the real problem setting considered in this paper appears only in equation 5 and is not emphasized as a problem setting i suggest maybe section 3 could begin with talking about this setting then talk about the existing approaches with some math that at least hints what those methods are and why the authors think they are not sufficient and then come to section 31 the authors discussed theorem 5 in the context when mathcalf is a linear class in this case i believe the model is not equivalent to linear mdp could the authors comment more on what the difference is also how does the present result compare with agarwal et al 2020 and modi et al 2021 theorem 5 uses a thompson sampling algorithm but the proof sketch mentions connection to a certain ucb algorithm are there reasons for not using ucb directly in algorithm 1 could the authors discuss a bit more in detail how the proof compares with the 
standard eluder dimension proof eg of wang et al currently the proof sketch does not say much where the bounded eluder dimension is used and how is the application different from the existing eluder dimension proof this one is about experiments when deployed into practice the main difference between spede and existing modelbased algorithms is just that the parametrization 3 and 4 is used aside from sgld as for approximate bayesian inference gaussian policy parametrization as the authors mentioned with the above questions in mind i am left confused about the concrete novelty in the proof techniques or problem settings and at most left with the impression that the result may be new but not sure how exactly it compares with prior work other comments the practical issues in implementing the proposed spede maybe this paragraph can form a standalone section after section 4 since logically the practical implementation tricks are not used in the theoretical results expected regret i believe it is usually called the bayes regret in the thompson sampling literature table 1 spedereg on mountaincar the number should not be bolded overall i think this paper makes interesting contributions on both the theoretical and empirical end of representation learning within rl however significant work needs to be done in order to clarify its problem setting results and position within related work docsepthis paper studies the single reward episodic mdp problem with the modelbased ts algorithm assuming the given model class satisfies realizability regularization property bounded eluder dimension low covering number and the true dynamics is the stochastic control model under gaussian noise transition eq 5 the authors show the polynomial bayesian regret guarantee in addition experiments on openai mujoco is conducted clarity and writing my first major comment is about clarity and writing this paper emphasizes that the algorithm is doing representation learning however i believe that modelbased learning might be more suitable for this paper one major difference between the current paper and the prior representation learningrewardfree learning work is that the algorithm in this paper can only handle a single reward while previous modelbased or modelfree works can learn a representation that tackles multitasks multiple rewards i think such differences should be discussed in the paper since the algorithm can only a handle single reward i do not see the significance and advantage of learning the representation model it belongs to the modelbased algorithm where the ts agent tries to learn a model to address the rewardaware rl problem moreover some claims are not accurate it is mentioned that these algorithms learn a uniformly accurate model through a rewardfree exploration upon which decouple the learning from the exploration but modi et al 2021 is a modelfree algorithm and does not try to learn a model modi et al 2021 first proposes to solve minmaxmin optimization instead of minmaxminmax optimization as one operator has a closedform solution the minmaxmin optimization is furthered solved in a more computationally efficient way by considering squared loss minimization and saddlepoint problem for the theoretical part the assumptions are not written clearly for example i believe equation 5 is a crucial assumption but it is not discussed explicitly in section 41 the regret bound is about the bayesian regret and such guarantee is much weaker than the frequentist regretworst case regret the related discussion about the bayesian 
regret and the definition of bayesian regret epf are missing i feel the title a free lunch from the noise oversells the paper indeed it is a restricted assumption and the theoretical guarantee is obtained under such strong assumption technical analysis and algorithmic framework my second comment is about technical and algorithmic novelty the algorithm is the standard modelbased ts algorithm intuitively given the model class ts algorithm will identify the true model given a large amount of data in addition the authors make a strong assumption on the transition dynamics eq 5 and the analysis is directly adapted from prior work russo van roy 2013 2014 osband van roy 2014 i feel the technical contribution is rather limited experiments i think the most significant contribution of this paper is the experimental results however im not confident in evaluating the empirical results and i feel there are a bunch of related modelbased algorithms that achieve good empirical performance in the experimental part its unclear how the algorithm is implemented how do you choose the function class f and the prior distribution pf i believe the clarity of the writing can be improved it might not be a good idea to emphasize the algorithm is conducting representation learning and using the modelbased learning might be more suitable the theoretical contribution is rather limited and most significant contribution seems to be on the empirical side i have to say im not familiar with the empirical work the authors make a strong assumption on the true model the algorithm and the analysis seems to be directly adapted from prior work eg russo van roy 2013 2014 osband van roy 2014 it seems that there is a gap between the theoretical part and the experiment and its unclear how the experiment is conducted ### Summary:
the ac summarizes the major strengths and weaknesses of the paper pointed out by the reviewers with possible omissions and additions by the ac strengths 1 the paper makes an important observation that the linear mdp assumptions can be met when the true dynamics has additive noise 2 inspired by the theory the paper proposes a new algorithm that empirically outperforms sac the success of the algorithm is very interesting and surprising to some degree weaknesses 3 most of the reviewers and the ac think the representation learning perspective is questionable if one strongly believes that the phi and mu in the linear mdp assumption should be interpreted as representations then yes this paper is about representation learning in rl and the representation learning is a free lunch however suppose one ignores the linear mdp perspective for the moment and only looks at the modeling assumption s' = f(s,a) + epsilon then f can only be interpreted as a dynamics model and has nothing to do with the term representation that is commonly used in practice representation means the penultimate layer of the neural nets typically in empirical rl moreover in the theory part of the paper the dynamics model is learned via a standard modelbased approach fitting f(s,a) to s' which also suggests that f should be interpreted as a dynamics model instead of a representation how to reconcile these two perspectives the acs own opinion is that this suggests we shouldnt blindly call the phi in the linear mdp formulation a representation in all scenarios but regardless of the acs own opinion i suspect that the paper needs to very explicitly discuss and clarify these discrepancies instead of somewhat sweeping it under the rug and claiming the paper is about representation learning without a stronger justification 4 the sample efficiency depends on the eluder dimension which is only known to be polynomial for linear models recent works have shown that the eluder dimension for even simple nonlinear models can be exponential the analysis seems to be also quite related to previous analyses that use the eluder dimension i think this fact limits the theoretical contribution of the paper 5 there could be a better exposition of the empirical implementation in the paper it appears that the implemented algorithm still has some major differences from the theoretical algorithms 6 its unclear if the paper should only compare with modelfree algorithms at least the theoretical algorithm fits f(s,a) to s' explicitly in the definition of the confidence region therefore it does not seem to be quite fair to compare with modelfree algorithms given these considerations and given that the majority of the reviewers express concerns about various subsets of these issues 3-6 the ac would recommend the authors revise the paper and resubmit to another top ml venue the ac thinks that the paper contains really interesting and novel observations but the interpretation of the observation might require more thought and clarification
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary and key claims this paper repurposes the balanced representation learning framework for estimating treatment effects originally proposed in shalit et al 2017 for the survival prediction setup the proposed model deals with selection bias induced by confounded treatment variables and in addition deals with censoring bias and informative censoring the key contributions claimed by the paper are developing a loss function incorporating adjustments for informative censoring and selection bias developing a generative model for event times based on planar normalizing flows proposal of survivalspecific evaluation metrics including a new nonparametric hazard ratio estimator originality and significance overall i think that the paper is a straightforward application of the balanced representation method in shalit et al 2017 to survival outcomes it does not seem like survival prediction is any different from the conventional ite setup with respect to the representation learning aspect of the model hence i dont think that the paper contributes methodologically to the problem of handling selection bias moreover the censoring terms in the loss function are also very similar to those introduced previously in chapfuwa et al icml 2018 paper based on this i think that the extent of technical contribution in the paper does not pass the acceptance threshold i was expecting some more analysis on the interplay between censoring bias selection bias and the effect of both on causal identifiability unfortunately it seems that authors chose to address each of these impairmentsbiases separately using existing solutions and the resulting model is simply an amalgamation of existing ideas the generative modeling part of the model is somehow alien to the original problem of estimating treatment effects on survival outcomes it is not clear why planar flows were specifically used and why flows are needed at all since the complexity needed in modeling is in the relation between features and survival parameters and not the complexity of the survival distribution itself the usage of normalizing flows on the output layer is fine but seems to me unnecessary this reinforces my impression of the model being all over the place on the positive side i think that the problem of estimating ites on survival is very important and rarely addressed most ml models for ites focus on realvalued targets but this is rarely the relevant setup in practice as survival is often the measure of treatment efficacy in medicine i also think that the idea of comparing the estimated hrs of rcts with the ones recovered by the model is a very smart way to evaluate counterfactual inferences and can be a useful evaluation metric for future papers docsepdisclosure i found this paper online during review process httpsarxivorgabs200607756 this is a comprehensive paper with interesting application of counterfactual inference under survival analysis setting overall my recommendation is to accept it nice that the proposed nonparametric approach in this paper can adjusts for bias from confounding due to covariate dependent selection bias and censoring informative or noninformative under three criterions concordance index cindex harrell jr et al 1984 mean coefficient of variation cov and calibration slope cslope chapfuwa et al 2019 and three datasets framingham actg semi synthetic actg compared proposed method with 7 seven others including 
Disclosure: I found this paper online during the review process (https://arxiv.org/abs/2006.07756). This is a comprehensive paper with an interesting application of counterfactual inference in a survival analysis setting; overall, my recommendation is to accept. It is nice that the proposed nonparametric approach can adjust for bias from confounding due to covariate-dependent selection bias and censoring (informative or non-informative). Under three criteria, concordance index (C-index; Harrell Jr. et al., 1984), mean coefficient of variation (COV), and calibration slope (C-slope; Chapfuwa et al., 2019), and on three datasets (Framingham, ACTG, and semi-synthetic ACTG), the paper compares the proposed method with seven others, including survival Bayesian additive regression trees (SurvBART; Sparapani et al., 2016), using a nonparametric Kaplan-Meier-based estimator and a Cox proportional hazards model with the real HR form, under three normalized weighting schemes. Regarding p. 6, Equation 10: the nonparametric form is a natural adoption of the KM estimator for the known S(t); I wonder about the motivation for choosing a linear approximation to S, and I am curious whether the cardiovascular and HIV data adopted here happen to have an S that is not so curved; could you shed light on this? On p. 3, the assumption of no unobserved confounders (ignorability) sounds strong; I understand the mathematical challenge of relaxing it, so maybe this is for future research. The overall presentation is nice; the organization of a few places might be improved to help a first-time reader follow, e.g., ITE is initially defined on p. 3 without an example until two paragraphs below, and h_a is first mentioned with no prior definition and no explicit mathematical relation to p(t|x); please briefly specify that do(A = a) denotes the effect of intervention. Minor issues: align the symbols used across the paper, e.g., add subscripts when defining s_i, m_i for i = 0, 1, to increase clarity.
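For reference, the Kaplan-Meier estimator that the nonparametric form above builds on can be written in a few lines. This is a generic sketch assuming right-censored data given as (time, event-indicator) pairs, not code from the paper.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival function S(t).

    times:  (n,) observed times (event or censoring time).
    events: (n,) 1 if the event was observed, 0 if right-censored.
    Returns the distinct event times and S(t) evaluated at them.
    """
    order = np.argsort(times)
    times, events = times[order], events[order]
    event_times = np.unique(times[events == 1])
    surv, s = [], 1.0
    for t in event_times:
        at_risk = np.sum(times >= t)               # subjects still under observation
        d = np.sum((times == t) & (events == 1))   # events occurring at t
        s *= 1.0 - d / at_risk
        surv.append(s)
    return event_times, np.array(surv)
```

A nonparametric hazard-ratio estimate of the kind the reviewer discusses could then be formed by contrasting such curves under treated and control predictions, though the exact construction of the paper's Equation 10 is not reproduced here.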
Summary: this paper provides an approach for causal inference in observational survival datasets in which the outcome is of time-to-event type with right-censored samples. The method consists of a representation learning component to reduce selection bias and a survival analysis component modeled with normalizing flows. Pros: the paper addresses an important and interesting question, and the manuscript is well written and easy to understand. Cons: my main concern is the minimal originality and significance of this work; that is, the representation learning component is directly taken from Shalit et al. (2017) and the objective function for the survival analysis component is directly taken from Chapfuwa et al. (2018). Also, the use of normalizing flows for modelling time-to-event targets is not well motivated. The literature review only points to several publications but does not go in depth into why or how the proposed method differs from them. There are also some important references missing; for example, Miscouridou et al. (2018) also use normalizing flows for survival analysis, and it is necessary that the authors discuss how their work differs from theirs (Miscouridou, X., Perotte, A., Elhadad, N., Ranganath, R., 2018, "Deep survival analysis: nonparametrics and missingness", Machine Learning for Healthcare Conference). The paper also suffers from many inaccurate statements; a few examples follow. 1) In the third paragraph of the introduction, the authors state that the treatment assignment mechanism is not known a priori and that therefore there may be variables, known as confounders, affecting both the treatment and the survival time, which lead to selection bias. This is wrong: even if we know the treatment assignment policy a priori, we still might have selection bias; the two are independent. 2) In the last sentence of paragraph four of the introduction, the authors say that the methods cited above that account for confounding bias by reweighting lack a counterfactual prediction mechanism. This is wrong, because trivially all methods have a prediction mechanism in place, and the ones that account for confounding bias can predict counterfactuals accurately as well. 3) The authors should note that representation learning does not remove confounding bias, it might only reduce it; likewise, reweighting does not remove confounding bias either, it only accounts for it. Minor: in Section 2, under "estimands of interest", the authors state that lambda(t | x) is defined below, but it is never defined. In Section 3, under "accounting for selection bias", the authors state that we can go from Eq. 1 to Eq. 2 because identifiability holds, since x is a sufficient set from A into T; could you please explain how this is different from ignorability, i.e., (T0, T1) independent of A given X?

This paper is very well written; the motivation and formalism are also clear, with every step in the argument properly justified. The extension of individualized treatment effects to survival data is in some sense straightforward, as both areas are quite mature and can be unified with aggregated loss functions dealing with biases of different types. The proposed solution, metrics, and datasets put forward for this problem are compelling, though, and I believe they will serve as a benchmark for further studies on treatment effects and survival data. One question I have is on Corollary 1: I don't see a meaningful difference between this statement and the one given by Shalit et al. (2017), nor a proof in the appendix; why the separate statement? The experiments are somewhat underwhelming: survival BART has also been developed for treatment effect estimation [1], and that benchmark is certainly more relevant; similarly, survival-based deep learning architectures have been developed which could have been considered as well [2]. At the very least, modelling each treatment group separately, or including treatment as an additional feature, should be considered to understand where the source of gain comes from. As presented, since almost all benchmarks consider only linear interactions between features, I would guess that improvements come from nonlinear modelling rather than from bias reduction for censoring and selection bias. [1] Hu, Liangyuan, Jiayi Ji, and Fan Li, "Estimating heterogeneous survival treatment effect via machine/deep learning methods in observational studies", arXiv preprint arXiv:2008.07044, 2020. [2] Lee, Changhee, et al., "DeepHit: a deep learning approach to survival analysis with competing risks", AAAI 2018.
### Summary:
This paper provides an approach for causal inference in observational survival datasets in which the outcome is of time-to-event type with right-censored samples. To this end, the paper adapts the balanced representation learning approach proposed in Shalit et al. (2017) to the context of survival analysis, using flexible models to learn the nuisance models, as is common in machine learning. The authors validated their approach via a simulation study and a set of application datasets: an EHR-based cohort study of cardiovascular health, an RCT dataset of HIV patients, and a semi-synthetic dataset. The main concerns of the reviewers were due to a perceived lack of originality relative to the original proposal in Shalit et al. (2017).
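Since both the reviews and the summary hinge on the balanced representation learning idea of Shalit et al. (2017), a minimal sketch of that objective may help: a shared representation is trained on the factual loss plus a discrepancy penalty between treated and control representations. The linear-kernel MMD used here is one common choice of discrepancy; the names and the squared-error factual term are illustrative assumptions, not taken from the paper under review.

```python
import numpy as np

def mmd_linear(phi_t, phi_c):
    # Linear-kernel MMD between treated and control representations.
    diff = phi_t.mean(axis=0) - phi_c.mean(axis=0)
    return float(diff @ diff)

def balanced_representation_objective(phi, y_pred, y_true, a, alpha=1.0):
    """CFR-style objective: factual prediction loss + alpha * imbalance penalty.

    phi: (n, d) learned representations Phi(x); y_pred, y_true: (n,) outcomes;
    a: (n,) binary treatment indicator. In the survival adaptation criticized
    above, the squared error would be replaced by censoring-aware likelihood terms.
    """
    factual = np.mean((y_pred - y_true) ** 2)
    imbalance = mmd_linear(phi[a == 1], phi[a == 0])
    return factual + alpha * imbalance
```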
(Tokenized fields for this row, i.e. input_ids, attention_mask, and labels, encode the text above as token ID arrays and are omitted here.)
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
The paper proposes ELLA, a Nyström approximation, as an alternative posterior predictive approximation for the linearized Laplace. The method uses the NTK-based formulation of the linearized Laplace and improves over other common Laplace approximations in terms of the quality of the predictive, while maintaining a relatively low computational complexity. Strengths: it can reduce the cubic complexity in data points n or parameters p of the linearized Laplace in neural networks down to a much lower complexity that can be controlled by the number of samples used for Nyström; this way a performance-accuracy tradeoff can be made. The performance of the proposed posterior predictive improves over cheap last-layer Laplace and is better than the full-network variants. There is an interesting and open discussion of the overfitting issue of the proposed method (ELLA), and exhaustive experiments with ablations on the number of factors K and several metrics of interest; overall this convinces me that the proposed method is a valuable addition to the family of Laplace approximations. Weaknesses: only the posterior predictive is discussed, not marginal likelihood estimation; since the proposed method is rather simple (take the linearized Laplace in its kernel variant and apply Nyström, which is well known from the GP literature), it would greatly strengthen the paper to show not only predictives but also the marginal likelihood, which should in principle be easy to add. There is a misunderstanding of the relationship between linearized Laplace and the GGN (ll. 36 and 30-32): linearized Laplace and Laplace with the GGN are equivalent, as shown in reference [22]; the equivalent variant with neural tangent kernels has been used before as well, in [22, 28], but without Nyström approximations. Lines 92-99 discuss the scalability issues only of the parametric LLA, which requires approximations to the p x p GGN; the problem tackled in the present paper, however, is the approximation of the equivalent n x n kernel, which should, for example, be defined clearly; n x n is still intractable, so the proposed Nyström approximation is well justified. ELLA does not seem to approximate the full LA, which it should in theory: Figure 1 shows a discrepancy between ELLA and LLA, and it seems like ELLA does not get very close to LLA, with no explanation as to why, although line 50 states that the approximation becomes accurate. The relation to prior work could be pointed out more clearly; for example, it is unclear to what extent the theorems are novel/original or whether they follow in a straightforward way from previous theoretical contributions. There are no immediate societal impacts of this work; computational limitations are discussed in the paper, for example the authors discuss the issue of overfitting present in their method.

This work proposes a new scalable variant of the linearized Laplace approximation (LLA) for inferring posterior distributions over the weights of deep learning models. Leveraging connections between the LLA and neural tangent kernels (NTKs), namely that the LLA can essentially be viewed in function space as a Gaussian process with the NTK, the paper describes how to speed up the LLA via a Nyström approximation to the NTK, effectively yielding an implicit sparse approximation to the LLA-GP predictive distribution. The work then elaborates on how to efficiently implement the proposed approach using automatic-differentiation frameworks and provides a theoretical analysis of the approximation error induced for the kernel.
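The Nyström construction referred to above can be sketched generically: sample M landmark points, eigendecompose the small M x M kernel block, and use it to build a rank-K factorization of the full n x n kernel matrix. This is a minimal illustration of the standard technique with a placeholder kernel function standing in for the NTK; the exact construction, scaling constants, and hyperparameter names in the paper may differ.

```python
import numpy as np

def nystrom_features(X, kernel_fn, M=64, K=16, seed=0):
    """Rank-K Nystrom feature map so that kernel_fn(X, X) ~ Phi @ Phi.T.

    X: (n, d) inputs; kernel_fn(A, B) returns the kernel matrix between A and B
    (a stand-in for the NTK of the trained network). M landmark points are
    sampled and the top-K eigenpairs of the M x M block define the features.
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(X.shape[0], size=M, replace=False)   # landmark points
    K_nm = kernel_fn(X, X[idx])                           # (n, M)
    K_mm = kernel_fn(X[idx], X[idx])                      # (M, M)
    evals, evecs = np.linalg.eigh(K_mm)                   # ascending eigenvalues
    evals, evecs = evals[-K:], evecs[:, -K:]              # keep top-K eigenpairs
    return K_nm @ evecs / np.sqrt(np.clip(evals, 1e-12, None))   # (n, K) factor
```

In the paper's setting, the resulting low-rank factor would be plugged into the GP-form predictive covariance of the linearized Laplace; the function-space derivation itself is in the paper.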
The resulting ELLA method is then empirically shown to outperform several baselines, both other LLA variants and a variational method, across a range of benchmarks; even large-scale results with a vision transformer architecture on ImageNet are reported. It is also empirically found that ELLA (and LLA methods more generally) tends to overfit in some settings, which is proposed to be remedied by using only a subset of the data for estimating the covariance matrix, similar in spirit to the early-stopping methods often used in standard neural network training. Summary of review: this work proposes a novel, theoretically justified, and clever variant of the Laplace approximation that is empirically shown to be performant and scalable to fairly modern deep learning settings, and that could therefore have significant impact by enabling practical uncertainty quantification for deep neural networks. Overall, there are many things that I liked about this paper; however, I also see some significant issues with the empirical evaluation (see details below). All in all, I am inclined to recommend acceptance of the manuscript, but would be more enthusiastic in my judgement if the authors can convincingly address the concerns raised. Strengths: the paper is overall well written, easy to follow, and clearly structured. The proposed ELLA method for accelerating the LLA is novel, theoretically sound, and intuitively sensible; it is therefore a timely and welcome addition to the fast-growing Laplace toolbox for Bayesian deep learning. I appreciate the implementation-specific details in Section 3.4, including the code snippets in Algorithms 1-2, which make it fairly clear how one would practically implement ELLA without needing to look at the actual Python code. The theoretical results on the approximation error induced by ELLA are nice to have, even though they appear to be somewhat straightforwardly inherited from previous analyses of Nyström-like kernel approximations. A main advantage of the proposed ELLA method is that a practitioner can actively control the performance vs. cost tradeoff by choosing the hyperparameters M and K, in contrast to most other methods (e.g., LLA-diag and LLA-KFAC) that have a fixed cost and performance; however, as mentioned in the weaknesses below, this tradeoff should be assessed more thoroughly in the empirical evaluation. The empirical evaluation demonstrates that ELLA can outperform relevant baselines on image-classification-based uncertainty calibration and out-of-distribution detection tasks. I particularly appreciate the inclusion of experiments on ImageNet, which is unfortunately still not so common in the Bayesian deep learning literature; I was especially impressed by the results on the recent vision transformer architecture, which promisingly demonstrates that the method can be applied in fairly modern deep learning settings, which again is not typically the case, as Bayesian methods normally lag behind significantly when it comes to adopting advances in deep learning. I really liked the experiment showing that ELLA (and LLA methods more generally) has a tendency to overfit to larger datasets, in the sense that the test loss starts increasing again past a certain number of data points used for fitting the covariance matrix, demonstrating that LLA methods can be significantly sped up by subsampling the data in some settings. Weaknesses: as all experiments seem to have been run with just a single random seed, the results do not come with any error bars, which makes it difficult to reason about the statistical significance of the reported conclusions.
While I understand that repeating experiments for multiple random seeds linearly increases the required computational effort, reporting some sort of error statistic would significantly aid the credibility of the drawn conclusions. For the particularly expensive experiments (i.e., on ImageNet) I could perhaps accept lack of compute as an excuse, although even there a minimum of 3 seeds would be desirable; but on the cheaper experiments (i.e., on CIFAR-10) I would really expect some repetition of the experiments. The empirical comparison also does not seem entirely fair, as different methods have different memory and compute requirements; in particular, for the hyperparameters chosen, ELLA seems to be more expensive than some of the other baselines considered in terms of compute and/or memory cost, as, e.g., also shown in Fig. 4c. It might therefore be more appropriate and insightful to plot performances in 2D, with memory and/or compute effort on the x-axis and performance on the y-axis, so that it becomes clear how the different methods trade off cost vs. performance; for ELLA it would then be great to report performance for different values of M and/or K (i.e., different costs), yielding a Pareto curve. A practitioner can then choose the method and hyperparameters on the Pareto front that best fulfil their needs, i.e., either (1) as cheap as possible for a given desired performance x, or (2) as performant as possible for a given desired cost y. Without such a plot it is difficult to draw conclusions on how superior ELLA really is, although ELLA's ability to cater to a wide range of performance-cost tradeoffs by sweeping over M and K should already make it look better than most of the baselines. You already show several plots that are related in spirit (Fig. 2 a-b, Fig. 4 b-c), but I believe the specific kind of plot I described would be much more insightful. For the ImageNet experiments, I would expect that at least the last-layer KFAC variant should be feasible to run without yielding an OOM error with either a ResNet or a ViT; e.g., the Eschenhagen et al. (2021) paper you cite seems to have managed to run this Laplace variant on ImageNet with a ResNet. I think it would significantly strengthen the paper to at least have this one Laplace baseline to compare with; if this is really an issue with the laplace library, I would encourage you to reach out to the authors or raise an issue on the GitHub repo (I also had issues with this library in the past, and in my experience the authors are typically happy to help and fairly quick to respond). The proposed approach can be viewed as limited in novelty, in the sense that it effectively combines the LLA with a Nyström approximation to the NTK, both of which have been studied fairly extensively before; that being said, I think the idea is pretty neat (even though somewhat obvious in hindsight) and I appreciate that the authors made the effort of actually getting this to work, so I don't view this as a major issue. Minor issues: ll. 226-227, to back up the claim that ELLA delivers a closer approximation to LLA, it might be worth explicitly quantifying the discrepancies between the predictive distributions using some metric; qualitatively (i.e., by just looking at the plots) it seems like LLA-KFAC also comes pretty close, but I agree that even LLA might underestimate in-between uncertainty in this setting, so the ELLA fit could be more desirable. Generally, it would be useful to have short takeaway messages in the captions of each table and figure, for clarity and convenience. Yes, the authors have adequately addressed the limitations.
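The cost-vs-performance plot requested above boils down to identifying non-dominated (cost, performance) pairs. A small helper along these lines is sketched below, purely as an illustration of the suggested evaluation and not as anything taken from the paper; performance is assumed to be "higher is better".

```python
def pareto_front(points):
    """Return the non-dominated subset of (cost, performance) pairs.

    A point dominates another if it is no more expensive, performs at least
    as well, and is strictly better in one of the two.
    """
    front = []
    # Sort by ascending cost, breaking ties by descending performance.
    for cost, perf in sorted(points, key=lambda p: (p[0], -p[1])):
        if not front or perf > front[-1][1]:
            front.append((cost, perf))
    return front

# e.g. one (compute cost, accuracy) pair per baseline or per (M, K) setting of ELLA:
# front = pareto_front([(1.0, 0.91), (2.5, 0.93), (3.0, 0.92)])
```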
The paper targets the improvement of probabilistic predictions in Bayesian deep learning with the linearized Laplace approximation. In particular, it proposes to apply the Nyström kernel approximation to the neural-tangent-like covariance matrix of the corresponding Gaussian process that is the function-space dual of the linearized Laplace in weight space. Using the Nyström approximation allows the authors to scale the linearized Laplace to ImageNet and ViT scale; they compare their method to other Laplace approximations in weight space and find that their method performs better, though in some cases only marginally, at a similar or lower computational cost. Strengths: the Laplace approximation for larger-scale machine learning has recently seen increasing attention, and this paper further improves scalability by introducing the Nyström approximation in its dual function-space formulation; the paper seems to be technically correct, though I haven't checked all the details. Weaknesses: generality and novelty: the novelty and originality presented in this paper are limited, especially when compared to other papers at ICML or NeurIPS in this space; e.g., ref. [6] provides a library for Laplace approximation and a thorough review/benchmark, and ref. [9] introduces kernel approximations more generally and not only for large-scale Laplace predictions. While this paper extends the Laplace approximation to ImageNet and ViTs, it ultimately discusses one particular approximation (Nyström) to one particular covariance function (NTK) for one particular task (probabilistic predictions on pretrained models); it largely builds on existing libraries, and improvements over MAP predictions are often marginal, especially on some of the larger-scale problems, and are provided without error bars. The authors also claim that no other methods can be applied to ViT or ImageNet, yet last-layer Laplace should be applicable in this case as well, at least when subsampling the dataset; e.g., [6] apply last-layer Laplace to transformer models. Contributions and relation to prior work: several results in Secs. 3.2 and 3.3 are in my opinion not novel and have been similarly derived in the kernel literature, where the Nyström approximation is commonplace, as well as in Appendices A.2 and A.3 of NeuralEF (ICML 2022; reference [9] in the paper under review) for the NTK kernel. While NeuralEF is referenced as ref. [9], it is not mentioned in Secs. 3.2 and 3.3, and it is not clarified which results there are novel and which are already known. The Nyström approximation is a well-known and well-studied approximation to kernel functions, yet a discussion and presentation of this is missing from the paper. NeuralEF [9] highlights several limitations of the Nyström approximation when applied to the NTK, mostly to do with scalability (see Sec. 2.2 in [9]); in particular, they highlight the cost of the eigendecomposition as well as the prediction cost at test time. I agree with [9] that these could be problems, and the scalability of the Nyström approximation should be discussed more prominently (also see limitations below). I also found the presentation and clarity lacking: the setting (probabilistic prediction with Bayesian neural networks), the ultimate goal and objective (improving the linearized posterior predictive), as well as the discussion and equations of the baseline methods (e.g., Laplace with KFAC) should be introduced more clearly; e.g., the posterior predictive distributions evaluated are only mentioned in passing and do not have clear equations or equation numbers. The paper ultimately presents a technical contribution (Nyström for this particular kernel) in an arguably very technical way.
In my opinion, more emphasis should be put on explanation, and some of the technical details (e.g., the indexing with i in Sec. 3.3) should be left to the appendix; Algorithms 1 and 2 should be explained in much more detail. Comments: references to the text, as well as from the text to the algorithms, are missing (see weaknesses above). In particular, the authors only investigate ELLA for predictions; several works (e.g., refs. [22, 6]) point out that the choice of prior precision is important to achieve good predictive performance, in particular for uncertainty metrics (NLL, ECE). However, it looks like the values here have simply been taken from the pretrained model's regularisation, which may explain the poor performance of LLA and LLA-KFAC; for this reason, [6], as far as I understand, proposes to use last-layer Laplace by default when these hyperparameters are not tuned and the MAP model itself does not overfit. The illustrative regression example (Sec. 5.3) is not discussed in enough detail; in particular, it should be investigated and further discussed why ELLA can be better than the full-GGN model (LLA), to which it should be equivalent except for the Nyström approximation, of course. The authors are surprised (see the caption to Fig. 1) by ELLA performing better than the full model (LLA), yet they do not investigate or discuss this; for example, the full GP predictive without the Nyström approximation, as well as different and larger values for M and K, should be considered in this case to build further intuition for the model and to investigate this behaviour.
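The prior-precision point raised above is easy to act on in practice: one common recipe is a simple grid search picking the precision that minimizes validation NLL of the Laplace predictive. The sketch below is generic and library-agnostic; `predictive_nll` is an assumed callable standing in for whatever routine evaluates the predictive at a given prior precision, and is not an API from the paper or from the laplace library.

```python
import numpy as np

def tune_prior_precision(predictive_nll, val_data, grid=None):
    """Pick the prior precision with the lowest validation NLL.

    predictive_nll(prior_precision, val_data) -> float   (assumed callable)
    """
    if grid is None:
        grid = np.logspace(-4, 4, num=17)   # log-spaced candidate precisions
    scores = [(predictive_nll(p, val_data), p) for p in grid]
    best_nll, best_prec = min(scores)       # min over NLL values
    return best_prec, best_nll
```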
### Summary:
This paper introduces an approach to accelerating linearized Laplace approximations of Bayesian neural network posteriors, particularly for prediction tasks, by performing a Nyström approximation to the neural tangent kernel. The reviewers all eventually recommended acceptance; one reviewer was initially quite critical but, after a rather extensive discussion with the authors, revised to a borderline accept. In particular, some additional experiments analyzing overfitting in these models in general were appreciated. While this is perhaps borderline on the scores (5/6/7), given the overall quality of the work and the extent to which it was updated and improved during the rebuttal period, I would recommend acceptance.
(Tokenized arrays for this row, i.e. token IDs and attention mask, encode the text above and are omitted here.)
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 432, 2045, 6260, 273, 295, 9207, 1109, 3022, 10295, 34754, 50276, 66, 2022, 5750, 273, 253, 4081, 43746, 1332, 310, 326, 247, 34815, 476, 15257, 1453, 253, 3045, 4632, 2105, 5454, 2727, 407, 13887, 253, 4373, 22041, 278, 285, 465, 534, 310, 275, 4499, 281, 954, 643, 3082, 24088, 26198, 11282, 356, 285, 26198, 518, 28402, 326, 452, 247, 4229, 2105, 285, 3045, 2299, 347, 5393, 275, 253, 32213, 2708, 436, 5454, 2727, 943, 320, 7515, 625, 16575, 275, 253, 16774, 7103, 50276, 783, 16774, 7103, 14371, 326, 43746, 476, 562, 32231, 4623, 1666, 25379, 327, 2460, 42070, 3169, 11649, 18543, 285, 562, 1171, 35360, 5481, 8892, 891, 3782, 11435, 253, 11250, 273, 4679, 327, 4440, 257, 292, 534, 310, 19235, 1335, 417, 594, 1846, 275, 253, 17699, 16561, 3676, 4715, 6239, 891, 369, 3340, 17847, 407, 253, 1543, 327, 253, 3332, 8113, 39707, 10336, 534, 12532, 314, 14371, 326, 253, 1332, 476, 320, 3732, 275, 9648, 4980, 3676, 4715, 7533, 534, 969, 310, 417, 5431, 253, 1083, 50276, 32442, 16561, 3082, 9403, 3480, 3212, 3012, 672, 352, 3249, 281, 16253, 273, 16424, 275, 3676, 4715, 50276, 74, 1663, 10490, 253, 3368, 4645, 326, 43746, 285, 298, 4123, 3082, 625, 3839, 452, 247, 14955, 281, 689, 8491, 281, 4067, 15302, 275, 253, 3282, 326, 253, 1071, 2957, 7866, 3629, 969, 2469, 247, 2176, 1180, 273, 941, 2792, 908, 323, 13532, 253, 26677, 4315, 17227, 326, 298, 4123, 3082, 476, 320, 3012, 653, 264, 484, 407, 8790, 312, 4906, 253, 941, 275, 690, 7533, 50276, 20881, 1255, 265, 50276, 284, 512, 4679, 1646, 281, 452, 644, 1408, 342, 816, 247, 2014, 3632, 8357, 253, 1543, 513, 417, 1705, 342, 667, 2228, 8965, 534, 2789, 352, 2834, 281, 1921, 670, 253, 7605, 8453, 273, 253, 2361, 11815, 1223, 891, 2096, 326, 24385, 4679, 323, 2709, 3632, 12922, 23352, 5459, 253, 2424, 15180, 3434, 9610, 690, 3686, 273, 2228, 26312, 651, 3012, 8596, 253, 17938, 273, 253, 8392, 11815, 323, 253, 3782, 8214, 4679, 26332, 327, 4440, 257, 292, 891, 812, 4931, 2997, 3480, 273, 11897, 347, 271, 16267, 3738, 1014, 627, 247, 5927, 1180, 273, 495, 12922, 651, 320, 11408, 533, 327, 253, 20182, 4679, 26332, 327, 260, 338, 274, 740, 891, 651, 1663, 1902, 690, 22563, 273, 253, 4679, 50276, 783, 16774, 5301, 1057, 417, 1646, 7094, 4344, 347, 1027, 3082, 452, 1027, 3541, 285, 11897, 6095, 275, 1798, 323, 253, 4373, 22041, 6777, 43746, 3133, 281, 320, 625, 8214, 685, 690, 273, 253, 643, 1666, 25379, 2783, 275, 2426, 273, 11897, 285, 263, 3541, 2105, 347, 24088, 671, 2011, 275, 3036, 577, 68, 3103, 352, 1537, 320, 625, 4569, 285, 47860, 281, 7484, 16226, 275, 374, 69, 342, 3541, 285, 263, 11897, 3434, 327, 253, 1269, 10565, 285, 3045, 327, 253, 340, 10565, 594, 326, 352, 4916, 2590, 849, 253, 1027, 3082, 476, 5454, 2727, 2105, 4632, 3045, 323, 43746, 352, 651, 320, 1270, 281, 840, 1304, 3045, 323, 1027, 2193, 273, 278, 285, 263, 465, 26332, 1027, 4815, 27012, 247, 22865, 936, 6970, 247, 34815, 476, 840, 5206, 253, 1332, 27049, 22041, 327, 253, 22865, 936, 2914, 326, 1682, 3744, 44849, 616, 3198, 26332, 534, 310, 2057, 337, 347, 11142, 347, 1896, 323, 247, 1677, 6799, 3045, 1269, 390, 374, 347, 1347, 386, 347, 1896, 323, 247, 1677, 6799, 2105, 340, 1293, 824, 247, 7484, 352, 310, 2834, 281, 3812, 11815, 327, 849, 8936, 43746, 1663, 310, 50276, 20261, 11591, 284, 3745, 281, 28335, 281, 247, 4618, 2491, 273, 3045, 16736, 5454, 14273, 407, 28110, 689, 278, 285, 465, 943, 2168, 1056, 352, 1007, 1805, 685, 954, 273, 253, 1666, 25379, 368, 2168, 921, 2067, 14777, 326, 403, 2905, 275, 5968, 3036, 374, 490, 3036, 577, 49501, 533, 891, 2868, 253, 2173, 2238, 273, 
7484, 891, 2529, 651, 320, 1199, 625, 47860, 50276, 1542, 253, 4440, 257, 292, 4679, 891, 651, 1902, 326, 387, 1878, 253, 1390, 12026, 465, 28402, 12955, 943, 320, 17887, 281, 1408, 285, 417, 4917, 271, 258, 297, 2228, 342, 2057, 247, 501, 3024, 390, 9084, 50276, 909, 253, 1578, 5756, 73, 6533, 1162, 355, 43425, 2929, 368, 26542, 3133, 281, 452, 7303, 281, 1408, 436, 826, 5070, 12955, 327, 4440, 257, 292, 342, 247, 501, 3024, 891, 1158, 352, 651, 3012, 17084, 253, 2929, 281, 387, 1878, 452, 436, 581, 826, 5070, 8245, 281, 7277, 342, 604, 436, 310, 1663, 271, 2523, 342, 253, 826, 5070, 6335, 891, 651, 11907, 368, 281, 3986, 562, 281, 253, 4477, 50276, 22525, 271, 2523, 327, 253, 40477, 30905, 50276, 74, 671, 574, 3374, 342, 436, 6335, 275, 253, 2469, 285, 275, 619, 2793, 253, 4477, 403, 5431, 5211, 281, 1361, 285, 9648, 3158, 281, 3794, 50276, 783, 4081, 2746, 476, 320, 11575, 347, 3710, 275, 38135, 275, 253, 3282, 326, 352, 8069, 24772, 253, 298, 4123, 342, 247, 295, 9207, 1109, 11193, 281, 253, 295, 17922, 534, 1097, 452, 644, 5421, 9648, 18171, 1078, 326, 1146, 753, 891, 1158, 253, 2934, 310, 3965, 18176, 1014, 2167, 8489, 4755, 275, 17134, 18347, 285, 11435, 326, 253, 4477, 1160, 253, 3434, 273, 2686, 2970, 436, 281, 789, 594, 891, 13414, 1859, 436, 281, 320, 247, 2201, 2523, 50276, 37585, 3374, 50276, 77, 27648, 20785, 281, 896, 598, 253, 1750, 326, 43746, 26361, 247, 8003, 11193, 281, 298, 4123, 352, 1537, 320, 4409, 11120, 2677, 5411, 253, 37122, 875, 253, 15970, 10670, 970, 690, 7982, 36143, 26332, 407, 816, 2819, 387, 253, 14777, 352, 3133, 751, 26198, 518, 28402, 671, 3249, 3965, 2810, 533, 891, 5194, 326, 1014, 298, 4123, 1537, 45166, 275, 17352, 11649, 275, 436, 4758, 594, 253, 43746, 4944, 812, 320, 625, 11408, 50276, 43786, 352, 651, 320, 4217, 281, 452, 2159, 1379, 12594, 8169, 275, 253, 3403, 621, 273, 1016, 2829, 13206, 323, 19843, 285, 16397, 4754, 253, 4477, 452, 18212, 9713, 436, 5474, 339, 431, 248, 2929, 8571, 253, 7756, 273, 37851, 13650, 275, 17699, 16561, 3676, 4715, 342, 253, 4872, 1025, 826, 5070, 11193, 275, 1798, 352, 29328, 281, 4647, 253, 295, 9207, 409, 10295, 11193, 281, 253, 11454, 85, 606, 290, 3022, 26677, 4315, 273, 253, 3969, 305, 12064, 1232, 326, 310, 253, 3470, 4511, 8746, 273, 253, 4872, 1025, 826, 5070, 275, 2801, 5641, 970, 253, 295, 9207, 409, 11193, 4483, 253, 4477, 281, 4311, 253, 4872, 1025, 826, 5070, 281, 4440, 257, 292, 285, 9084, 4311, 597, 7277, 616, 1332, 281, 643, 826, 5070, 34754, 275, 2801, 2317, 285, 1089, 326, 616, 1332, 17923, 1805, 2167, 275, 690, 2219, 760, 42876, 387, 247, 2074, 390, 2406, 15180, 2105, 50276, 296, 3755, 20556, 50276, 783, 826, 5070, 11193, 323, 4067, 4311, 5145, 4715, 556, 4102, 2326, 3629, 4116, 285, 436, 2929, 2007, 19132, 9171, 1430, 407, 16984, 253, 295, 9207, 409, 11193, 275, 697, 8746, 1159, 2317, 15895, 50276, 783, 2929, 3133, 281, 320, 22335, 3451, 2167, 891, 419, 2254, 10141, 512, 253, 4278, 50276, 20881, 1255, 265, 50276, 8719, 1319, 50276, 2369, 652, 555, 253, 38135, 285, 3236, 414, 3559, 275, 436, 2929, 310, 3710, 3340, 672, 2429, 281, 643, 9380, 387, 17857, 1686, 390, 5723, 2824, 275, 436, 2317, 24088, 1275, 721, 3400, 247, 6335, 323, 826, 5070, 11193, 285, 247, 11080, 2278, 31591, 4698, 390, 1275, 898, 23970, 10295, 34754, 625, 3839, 285, 417, 760, 323, 1236, 2510, 25912, 826, 5070, 13650, 1223, 436, 2929, 8725, 253, 826, 5070, 11193, 281, 4440, 257, 292, 285, 362, 953, 253, 2929, 9142, 25339, 581, 1798, 11193, 295, 9207, 409, 281, 581, 1798, 26677, 1159, 295, 17922, 323, 581, 1798, 4836, 37851, 
13650, 327, 3215, 11273, 3210, 352, 8127, 21168, 327, 5368, 13747, 285, 11701, 689, 3711, 13650, 403, 2223, 16888, 3340, 327, 690, 273, 253, 1236, 7276, 25912, 3237, 285, 2530, 1293, 2228, 33396, 253, 4477, 671, 1750, 326, 642, 643, 3082, 476, 320, 3732, 281, 9084, 390, 4440, 257, 292, 2568, 1390, 12026, 826, 5070, 298, 4123, 943, 320, 7763, 275, 436, 1083, 347, 973, 387, 1878, 672, 8790, 312, 4906, 253, 10895, 24088, 721, 4647, 1390, 3828, 826, 5070, 281, 39707, 3210, 50276, 1987, 8303, 50276, 16429, 281, 2720, 789, 50273, 43249, 1543, 275, 4706, 4567, 285, 5922, 403, 275, 619, 4743, 417, 4460, 285, 452, 644, 12014, 6012, 275, 253, 10295, 6239, 835, 253, 295, 9207, 409, 11193, 310, 47817, 347, 973, 347, 275, 30762, 247, 19, 285, 247, 20, 273, 11454, 832, 17857, 1686, 938, 1423, 3806, 898, 275, 253, 2929, 762, 2278, 323, 253, 295, 17922, 10295, 1223, 11454, 832, 310, 23378, 347, 1275, 898, 275, 253, 2929, 352, 310, 417, 5393, 275, 4706, 4567, 285, 5922, 273, 436, 2929, 285, 352, 310, 417, 31637, 534, 1543, 275, 4706, 4567, 285, 5922, 403, 4460, 285, 534, 403, 2168, 1929, 50274, 783, 295, 9207, 409, 11193, 310, 247, 973, 1929, 285, 973, 5421, 11193, 281, 10295, 3470, 2568, 247, 5955, 49836, 273, 436, 310, 5816, 432, 436, 2929, 50274, 570, 1546, 832, 898, 16681, 2067, 7364, 273, 253, 295, 9207, 409, 11193, 672, 3732, 281, 295, 17922, 6571, 281, 513, 342, 9171, 1430, 923, 4706, 3307, 275, 898, 275, 1798, 597, 6780, 253, 2105, 273, 253, 299, 304, 9747, 42190, 347, 973, 347, 10554, 4815, 387, 1071, 673, 891, 5194, 342, 898, 326, 436, 812, 320, 3237, 285, 9171, 1430, 273, 253, 295, 9207, 409, 943, 320, 5469, 625, 46454, 50276, 12563, 923, 7364, 2708, 50276, 74, 1119, 253, 9759, 498, 15752, 281, 320, 417, 2590, 28910, 253, 4758, 37851, 10554, 342, 17699, 16561, 11454, 6928, 12553, 4736, 6082, 422, 3157, 4872, 1025, 12637, 15970, 347, 973, 347, 253, 8245, 3082, 5955, 50276, 2655, 569, 323, 24088, 826, 5070, 342, 465, 28402, 50276, 11425, 320, 5611, 625, 4518, 24088, 253, 12637, 15970, 10670, 50276, 15419, 11634, 403, 760, 5393, 275, 8136, 3088, 417, 452, 2590, 5150, 2346, 318, 3904, 28910, 253, 2929, 9142, 10262, 247, 7681, 7680, 295, 9207, 409, 323, 436, 1798, 10295, 275, 271, 25711, 1077, 7681, 1039, 275, 619, 4743, 625, 15075, 943, 320, 1691, 327, 8813, 285, 690, 273, 253, 7681, 4278, 24088, 44176, 342, 891, 275, 4706, 5922, 943, 320, 1669, 281, 253, 30762, 28910, 20320, 337, 285, 374, 943, 320, 5544, 275, 1199, 625, 2508, 5701, 50276, 250, 3065, 281, 253, 2505, 347, 973, 347, 432, 253, 2505, 281, 253, 11333, 403, 5816, 50275, 2887, 32213, 1840, 275, 1798, 253, 4477, 760, 7409, 43746, 323, 13650, 50276, 43249, 2987, 24088, 1275, 3307, 721, 1127, 562, 326, 253, 4327, 273, 2720, 12320, 310, 1774, 281, 5115, 1175, 15970, 3045, 275, 1798, 323, 20418, 13307, 282, 336, 2299, 352, 4453, 751, 253, 2193, 1060, 452, 3365, 644, 2668, 432, 253, 3215, 11273, 1566, 3963, 5837, 534, 778, 5513, 253, 4105, 3045, 273, 298, 4123, 285, 26198, 518, 28402, 323, 436, 1921, 721, 347, 2080, 347, 891, 2096, 29328, 281, 897, 1390, 12026, 826, 5070, 407, 4284, 672, 841, 4373, 22041, 403, 417, 24251, 783, 3711, 1566, 3139, 1057, 417, 689, 8491, 50276, 783, 47386, 9077, 1650, 4706, 8676, 310, 417, 5469, 275, 2217, 2508, 275, 1798, 352, 943, 320, 6949, 285, 2007, 5469, 2139, 43746, 476, 320, 1805, 685, 253, 2120, 305, 3757, 1566, 298, 4123, 281, 534, 352, 943, 320, 6425, 3707, 323, 253, 295, 9207, 409, 11193, 273, 2282, 253, 4477, 403, 9861, 923, 11743, 281, 3036, 337, 407, 43746, 9591, 1805, 685, 253, 2120, 1566, 298, 4123, 
347, 973, 2568, 597, 513, 417, 7409, 390, 2319, 436, 323, 1650, 253, 2120, 31025, 15970, 1293, 295, 9207, 409, 11193, 347, 973, 347, 1027, 9388, 1063, 2193, 323, 278, 285, 465, 943, 320, 2783, 275, 436, 1083, 281, 1973, 2007, 30328, 323, 253, 1566, 285, 281, 7409, 436, 8770, 50276, 187, 187, 4118, 18435, 27, 2520, 2929, 23970, 271, 2746, 281, 38757, 4872, 1025, 826, 5070, 34754, 281, 17699, 16561, 11454, 2990, 20731, 17327, 3782, 7296, 10554, 8892, 407, 9591, 247, 295, 9207, 1109, 11193, 281, 253, 11454, 28196, 10295, 50276, 783, 30628, 512, 8521, 14924, 6524, 50276, 531, 37317, 369, 8523, 3240, 4619, 533, 17265, 846, 247, 2581, 9470, 5955, 342, 253, 4477, 17265, 281, 247, 45210, 2997, 275, 1798, 690, 3081, 4679, 18918, 689, 31893, 275, 841, 3210, 275, 2087, 497, 14109, 50276, 6050, 436, 310, 4931, 45210, 327, 253, 7363, 49609, 1677, 253, 4583, 3290, 273, 253, 789, 285, 253, 6070, 281, 534, 436, 369, 9300, 285, 5520, 1309, 253, 30080, 22559, 2180, 891, 651, 5583, 14924 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this manuscript introduces a new benchmark named tweetnerd for nerd evaluation the proposed dataset is based on over 340k tweets from a 10 year time range and is publicly available with the proper licenses the authors compare tweetnerd with four existing benchmarks from multiple aspects including the number of unique entities entity mentions and the number of tweets the proposed dataset shows remarkable improvement over others in this paper the authors demonstrate the experimental settings of tweetnerd for nerdrelated tasks they are named entity recognition ner entity linking with true spans el and end to end entity linking end2end the experimental results with recent models are presented in this paper as well the numbers indicate the benchmark is challenging for current solutions and theres much to explore for this dataset 1 the proposed dataset is considered a largescale dataset that can adapt to a wide range of experimental settings 2 the paper introduces the experimental settings for three different tasks related to entity linking 3 compared to existing datasets the tweetnerd benchmark has notable advantages 4 the authors performed a set of experiments with recent models the results indicate the benchmark is challenging for future research 5 the authors utilize entropybased sampling and create tweetnerdood and tweetnerdacademic to improve the diversity of the datasets 1 this paper has a limited description of existing work and datasets in this domain 2 the paper can be improved with more detailed descriptions of models methods metric selections insights on results etc this would make the paper easier to follow 3 the presentation of figures and tables especially for figure 1 can be improved docsepthis paper introduces tweetnerd a dataset for benchmarking named entity recognition and disambiguation nerd systems on tweets this is so far the largest and most temporally diverse opensourced dataset benchmark for nerd on tweets two subsets of the dataset tweetnerdood and tweetnerdacademic are also provided the former is for assessing outofdomain performance the latter consists of tweets from a collection of existing academic benchmarks that have been reannotated with the new annotation guidelines the two subsets were evaluated for named entity recognition entity linking with true spans and endtoend entity linking tasks several ner and el approaches were compared and the results are briefly reported in the paper 1 the annotated dataset is released by twitter its larger and more recent than existing ones 2 the dataset keeps raw annotation data and is flexible 3 the limitations of the dataset are discussed 4 the paper is wellwritten and easy to read 1 the comparison to existing benchmarks needs more discussion 2 i didnt find any description of how candidate entities were generated 3 the sampling is not uniform partly due to creating tweetnerdacademic 4 evaluation was done only on the two subsets of tweetnerd and only some brief results were reported docsepsummary this paper manually annotates 340k tweets across 20102021 to benchmark named entity recognition systems and evaluate existing models on three related tasks named entity recognition entity linking with true spans and end to end entity linking contributions besides the regular tweets the paper also provides two subsets tweetnerdood and tweetnerdacademic to assess outofdomain performance and temporal generalization of ner models
respectively the largescale tweet dataset with manually annotated entity links enables further research on entity linking on social media the performance of some recent methods does not outperform some classic methods on three evaluation tasks which motivates further research for robust nerd models 1 the process of obtaining tweetnerdood is not clearly described and then perform stratified sampling based on tweet actions to divide these buckets into subbuckets hence its difficult to distinguish samples from this subset and those from regular subsets therefore the implications of the performance of investigated models on tweetnerdood are not very obvious 2 analyses of the performance of existing entity linking systems lack a potential reason for the failures of some more recent models compared with earlier models distribution shift could be one reason although some recent methods appear more powerful on their respective evaluation datasets they become less powerful when directly evaluated on the newly collected tweet dataset without finetuning 3 in the experiments section only the two subsets tweetnerdood and tweetnerdacademic are tested however the large number of remaining samples arent investigated hence their value is hard to tell eg models trained on the remaining samples can obtain much better performance on the two evaluation subsets docsepthe paper introduces tweetnerd which is a large dataset of 340k tweets mostly collected across 20202021 and which can be used to benchmark nerd systems the main contribution of the submission thus consists in the size of the dataset furthermore all expert annotations have been kept in the dataset which means that the dataset allows for further experimentation with ambiguous annotations or different levels of disagreement the main strengths of the dataset are mentioned above as contributions 1 size of the dataset and 2 disagreementaware dataset several weaknesses can be identified it is unclear what makes the proposed dataset better than existing ones in terms of temporal bias the dataset is much larger but many tweets have been published way before 2020 how much have the annotations changed in the tweetnerdacademic dataset compared to the original datasets why was the reannotation necessary were all annotators familiar with the task were there any instructions on how to deal with or manage disagreement why compute interrater reliability as percentages instead of the more reliable ways that use proper irr metrics krippendorffs alpha for instance there is very little discussion that acknowledges the reasons for disagreement and how they can impact the models that are trained and evaluated with this dataset the dataset does not seem to be balanced but only f1 scores are reported why is that and why is the f1 score the only needed metric the results are discussed in very little to no detail docsepin this paper the authors present a largescale dataset for entity recognition and linking for tweets in terms of size the dataset is much larger than existing datasets for the same problem settings based on the description the dataset is properly annotated with clear instructions to the annotators in this sense the quality of annotations is high and the authors release raw annotations in addition to the golden labels the dataset also comes with two carefully constructed subsets targeting easy and challenging cases the main weakness comes with the limited access and dynamics of the dataset following twitter policy only the twitter id is released instead of the raw content downloading the
dataset at a different time may get a different subset this would make the results from different papers not comparable i would then not consider this dataset to really be a benchmark the authors also do not mention the dynamics of wikipedia being edited every day the dataset is of large scale and the authors have considered many factors in constructing this dataset from tweet sampling to the annotation guideline and then to the construction of two subsets the authors conduct evaluations of a few models for three problems ner entity linking and endtoend entity linking the authors choose to release raw annotations in addition to the golden labels the raw annotation enables studies of other research problems on top of the entity linking problems 1 the main weakness is that access to the data is limited by twitter api terms and conditions i do understand that all users of the dataset have to follow the rules but the result is different versions of the dataset being used in different papers as a result model accuracies are not directly comparable 2 the authors do not consider the dynamics of wikipedia which is always being edited a specific version or dump shall be specified to avoid or minimize the differences used by different authors some entity linking models rely on interlinking between wiki entries from wikipedia which is not included in the annotation 3 the context used during annotation lines 50 53 might not match well with what is recorded in the dataset in other words a model may not be able to access the full context for making the correct linking an example is the temporal context of the tweet 4 the paper is in general not well organized and has lots of small issues in writing for example the annotation setup and interannotator agreement shall be presented together to offer a complete picture of the annotation process the authors have well addressed these comments ### Summary:
the authors release tweetnerd a collection of 340k tweets from a 10 year time range with the proper licenses for named entity recognition ner entity linking with true spans el and end to end entity linking end2end all reviewers appreciate that this is a largescale temporally diverse dataset with many advantages over existing benchmarks useful for multiple tasks and experiments show that it is challenging for current models they also find the paper wellwritten and easy to read the main shortcoming they find is that related work and existing benchmarks could be described more exhaustively and the paper could be clearer in certain areas some reviewers are also concerned that since the dataset needs to be downloaded via the twitter api it could change over time resulting in slightly incomparable results across groups
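the reviews above evaluate tweetnerd on ner el and end2end entity linking and report only f1 scores. a minimal sketch of how a strict spanlevel f1 could be computed from gold and predicted span entity tuples is given below. the function name, the tuple layout and the example wikidata ids are illustrative assumptions and are not taken from the tweetnerd paper.

def strict_f1(gold, pred):
    # gold and pred are collections of (tweet_id, start, end, entity_id) tuples
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)                        # exact span and entity match
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# example with made-up tweet ids, spans and wikidata ids
gold = {(1, 0, 5, "Q30"), (1, 10, 17, "Q76")}
pred = {(1, 0, 5, "Q30"), (1, 10, 17, "Q11696")}
print(strict_f1(gold, pred))  # 0.5

for the ner task the entity_id field would simply be dropped from the tuples, and for entity linking with true spans only the entity_id field can differ between gold and predicted tuples.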
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes an extension of the crossentropy method cem for optimisation that consists in an ensemble of multiple standard cem instances each one being optimised independently and the solution at each step is the output of the top performer cem at that step simulation results in a continuous control benchmark show that the proposed algorithm matches or outperforms previous cem variants the authors also use a toy problem to illustrate how using an ensemble can help to escape local minima strengths using an ensemble of optimisers is a powerful idea that can help to explore the solution space more efficiently and should be explored further the continuous control benchmark is relevant for this task and the baselines are sensible in particular having included sac as a reference is appreciated simulation results show similar or better performance compared to other cem variants in most of the environments the paper is generally well written and easy to follow weaknesses training curves are stopped before convergence for many environments this makes it difficult to evaluate the stability and final performance of the proposed approach especially for those environments in which the final performance of sac black dashed line is much higher than that achieved by the proposed approaches although the presented algorithm is novel the idea of using ensembles of optimisers is not comparison with other ensemblebased approaches like the decentralised cem introduced in 1 or the ensemble method proposed in 2 would be appreciated questions what is the practical benefit of using cem over other modelfree approaches like sac is it much faster in terms of actual time ie is the training time per step smaller why is the covariance matrix not learnt is it because of the large dimensionality of the parameter space i understand that each step in the training curves corresponds to a single state transition but that the points are actually represented per episode which can be finished after a max number of state transitions is my understanding correct minor comments since the state space is introduced as a subset of the real vector space i assumed it is not countable hence the transition probability distribution should take values in the reals instead of 0 1 1 s v macua s zazo and j zazo distributed blackbox optimization of nonconvex functions 2015 2 s khadka s majumdar t nassar z dwiel e tumer s miret y liu and k tumer collaborative evolutionary reinforcement learning 2019 the algorithmic idea is interesting well executed for rl and probably useful for optimisation in general but a comparison with other ensemble approaches is missing in the related work to better understand its novelty simulation results are promising but not fully convincing as they show similar training performance to previous methods and it is not clear what the final performance of the proposed approach is for many of the most interesting environments i lack a clear practical motivation for the cem approach docsepthis paper studies a novel cem method for modelbased rl while previous approaches used a centralized method based on a unimodal gaussian or gaussian mixture they propose a decentralized cem where each instance independently tracks its own data and topk estimates they first test their new method on a 1d toy task showing better convergence to the global optimum they then test their method on several modelbased rl tasks outperforming previous methods in some tasks while being on par or inferior in others
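the decentralized cem idea described in both reviews, independent cem instances that each keep their own samples and topk elites with the final solution taken from the best performing instance, can be sketched in a few lines. the instance count, population size, elite count and the way the best instance is picked below are illustrative choices and not the exact design of the reviewed paper.

import numpy as np

def decentralized_cem(f, dim, n_instances=5, pop=50, k=10, iters=20, seed=0):
    # maximise f over r^dim with n_instances independent cem optimisers
    rng = np.random.default_rng(seed)
    means = rng.uniform(-1.0, 1.0, size=(n_instances, dim))
    stds = np.ones((n_instances, dim))
    best_vals = np.full(n_instances, -np.inf)
    for _ in range(iters):
        for i in range(n_instances):
            # each instance samples, evaluates and refits only its own data
            samples = means[i] + stds[i] * rng.standard_normal((pop, dim))
            vals = np.array([f(x) for x in samples])
            elite = samples[np.argsort(vals)[-k:]]
            means[i] = elite.mean(axis=0)
            stds[i] = elite.std(axis=0) + 1e-6
            best_vals[i] = max(best_vals[i], vals.max())
    # the returned solution comes from the top performing instance
    return means[int(np.argmax(best_vals))]

with n_instances set to 1 this reduces to a standard cem instance, which is the baseline the reviewers compare against, so the sketch also makes it easy to see where the extra samples per environment interaction come from.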
while being on par or inferior in others strong the idea is straightforward but good and the toy example indeed suggest that a gmm cannot handle the multimodality in the optimization landscape the paper is well written very clear equations are concise the appendix is really detailed as well with full hyperparameter settings i am convinced another researcher could replicate these experiments i really like the 1d toy example it gives a good visual illustration of the method and what is happening weak i think your mbrl results are not super strong i see quite some noisy curves and in some of them your method indeed comes up but in some others it is actually below the alternatives i would say the real improvement is mostly for decentcema on acrobot reacher ant and reacher but given that is below on hopper and walker2d we may doubt how much signal there really is i do like that you show all results including negative but you may slightly rephrase the statement that your method either matches or outperforms their counterparts toy experiment i find it suprising that cemgmm looses the global optimum my main issue is that you do not report on the number of mixture components in the gmm this seems to be really crucial looking at the appendix results it seems that the cemgmm is unlucky on iteration 0 having few samples on the right but with enough mixture components this should still be fine and what is the number of instances in decentcem i hope the same as the number of mixtures did you tune both in short i lack some hyperparameter choices you make in the toy experiment end of 43 i have some trouble understanding why the amount of data goes up per environment interaction each instance has a different distribution over policies right and i can only evaluate one of these policies per environment interaction i might be missing something here but i would need some clarification here i think related work has a major omission since it only focuses on other mbrl work but does not discuss any related work from cem literature has this decentralized approach already been tried there or something similar then it can still be relevant to mbrl but you should note and discuss this the paper has no discussion and future work which i think should always be there the graphs are quite small and in some cases just unclear for example in figure 6 the colour coding is far from optimal i really have a hard time figuring out which line is which on my printed version in colour since you eg have red and pink on a black and white print if would really be completely infeasible figure 8 is also really small and needs an more extensive caption if you cannot increase the plot size at least increase the axis labels in size your methodology was really clear but a few things i did not get 1 is the vth in eq 3 a threshold per cem instance then i would write vthi 2 is the k of cemgmm the same as the k for each instance of decentcem ie does decent cem get more topk samples and should you correct for this i find it really hard to judge this paper since it has strong and weak points as mentioned above i like the idea it is straightforward and easy to grasp yet well motivated the paper is very clearly written has clear notation and gives good intuitive illustration of the idea on the downside i do not think the results are really convincing although there is some signal i lack hyperparameters on the number of mixtures in the toy experiment and the sensitivity of results to varying it i completely miss related work from the cem 
literature there is no discussion and future work and the graphs are not easy to read individually all these downsides can be overlooked but with all of them together i get in doubt docsepthe paper proposes decentcem which uses parallel instances of cem to learn optimal policies for problems that contain multimodal optimal actions instead of using a single policy as sampling distribution and optimizing the actions in the vicinity of this action trajectory the decentcem method uses parallel with multiple policies and cem optimizers therefore the proposed algorithm can learn multimodal actions and should improve sample efficiency the algorithm is evaluated on the standard openai benchmark tasks the paper wants to solve the problem that cem cannot handle multimodal action distributions within the topk samples in this case cem averages over the multiple modes and requires more samples to converge while this problem theoretically exists and one can describe a motivational example where this problem is relevant the problem is not relevant for the performed modelbased rl experiments on the openai tasks the algorithms have sufficient entropy to break the multiple solutions and converge to a single solution during a few replannings the difference in reward is usually not significant as shown by pinneri et al using the argmax which also avoids averaging over modes instead of the mean can improve runtime and marginally increase the reward but the differences are not essential from my personal experience with cem using the argmax over the mean operation can potentially increase performance in some environments but the increase is not significant the performed experiments of the authors do not show much of an improvement of decentcem over the other methods the authors also do not provide an evaluation that evaluates whether the computation time decreases with their parallel structure of cem policies furthermore the computational complexity during training most likely increases as one needs to train n policies therefore the proposed algorithm does not improve the empirical performance of poplin for more complex environments the proposed approach might be beneficial but for the openai benchmark tasks the parallel structure is not necessary to achieve a good performance the paper also does not provide new theoretical insights or has a good motivation using nparallel policies cem is a trivial extension of poplin with uncertain benefits if one wants to address the unimodal gaussian assumption of cem one should invest in sequential monte carlo methods which can handle multimodal solutions besides the poor motivation of the approach and limited experimental results the paper is wellwritten and easy to follow unfortunately only the idea is not sufficient to be presented at iclr the advantage of the proposed algorithm is unclear and there are no theoretical insights within the paper the algorithm is a trivial extension of poplin and has little relevance therefore i recommend rejecting the paper ### Summary:
this paper presents a decentralized version of the cem technique where an ensemble of cem instances run independently from one another and each performs a local improvement of its own sampling distribution the paper shows that the proposed technique can alleviate the problem of centralized cem related to converging to a local optimum the paper includes a theoretical analysis and simulation experiments that show some benefits of the proposed technique over centralized cem the key criticisms from the reviewers include the straightforward nature of the proposed idea which limits the technical contribution of the paper as well as the limited improvements over centralized cem in the simulation experiments in summary this is a borderline paper while the paper is wellwritten and the proposed approach is clearly explained the lack of strong empirical results that show a pronounced improvement of decentralized cem coupled with the incremental nature of the idea of decentralized cem makes me lean toward a rejection
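The reviews and summary above describe the decentralized CEM idea only in prose. As an illustration of the general scheme they discuss, the sketch below runs several cross-entropy-method instances independently, each fitting its own Gaussian to its own elite samples, and reports the best candidate seen by any instance. This is a hypothetical reconstruction from the reviewers' descriptions, not the paper's code; the toy objective, population size, and elite fraction are placeholder assumptions, and the per-instance policy networks and learned dynamics model used in the model-based RL setting are omitted.

```python
import numpy as np

def objective(x):
    # Placeholder multimodal objective standing in for the return of a sampled
    # action sequence under a learned model (not one of the benchmark tasks above).
    return float(np.sum(np.sin(3.0 * x) + 0.5 * np.cos(7.0 * x)))

class CEMInstance:
    """One independent CEM optimizer tracking its own sampling Gaussian."""
    def __init__(self, dim=1, pop=50, elite_frac=0.1, seed=0):
        self.rng = np.random.default_rng(seed)
        self.mu = self.rng.uniform(-2.0, 2.0, size=dim)   # distinct random start
        self.sigma = np.ones(dim)
        self.pop = pop
        self.k = max(1, int(elite_frac * pop))

    def step(self, f):
        samples = self.rng.normal(self.mu, self.sigma, size=(self.pop, self.mu.size))
        scores = np.array([f(s) for s in samples])
        elites = samples[np.argsort(scores)[-self.k:]]    # top-k samples
        self.mu = elites.mean(axis=0)                     # refit this instance only
        self.sigma = elites.std(axis=0) + 1e-6            # keep some exploration
        return samples[np.argmax(scores)], float(scores.max())

def decentralized_cem(f, n_instances=5, iterations=30):
    """Run several CEM instances independently and keep the best candidate."""
    instances = [CEMInstance(seed=i) for i in range(n_instances)]
    best_x, best_score = None, -np.inf
    for _ in range(iterations):
        for inst in instances:
            x, score = inst.step(f)
            if score > best_score:
                best_x, best_score = x, score
    return best_x, best_score

if __name__ == "__main__":
    x, score = decentralized_cem(objective)
    print(f"best candidate {x} with score {score:.3f}")
```

Because every instance keeps its own mean and variance, one instance can linger on a local mode while another tracks the global one, which is the behaviour the 1d toy experiment in the reviews is meant to expose; a single Gaussian would instead average the modes contained in its top-k samples.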
[ 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 271, 6880, 273, 253, 2831, 290, 10144, 1332, 260, 358, 323, 5556, 5837, 326, 8414, 275, 271, 19862, 273, 2709, 2629, 260, 358, 10872, 1016, 581, 1146, 5556, 1701, 10939, 285, 253, 2900, 387, 1016, 3213, 310, 253, 3453, 273, 253, 1755, 40247, 260, 358, 387, 326, 3213, 9864, 1543, 275, 247, 5415, 1453, 22791, 921, 326, 253, 4081, 5933, 10129, 390, 41731, 13015, 2045, 260, 358, 11640, 253, 4477, 671, 897, 247, 20953, 1895, 281, 17093, 849, 970, 271, 19862, 476, 1361, 281, 8773, 1980, 5927, 50275, 296, 3755, 20556, 50276, 5302, 271, 19862, 273, 5556, 34768, 310, 247, 6422, 2934, 326, 476, 1361, 281, 8338, 253, 2900, 2317, 625, 14556, 285, 943, 320, 14859, 2007, 50276, 783, 5415, 1453, 22791, 310, 4623, 323, 436, 4836, 285, 253, 1666, 25379, 403, 24600, 275, 1798, 1907, 2908, 7044, 347, 247, 3806, 310, 14109, 50276, 3549, 1427, 1543, 921, 2074, 390, 1805, 3045, 281, 643, 260, 358, 11640, 275, 954, 273, 253, 12620, 50276, 783, 2929, 310, 3839, 973, 3542, 285, 3477, 281, 956, 50276, 20881, 1255, 265, 50276, 31158, 9191, 403, 6331, 1078, 14940, 323, 1142, 12620, 436, 2789, 2834, 281, 7472, 253, 7882, 285, 2457, 3045, 273, 253, 4081, 2746, 3340, 323, 1110, 12620, 275, 534, 253, 2457, 3045, 273, 7044, 2806, 17889, 1386, 310, 1199, 2169, 685, 326, 6786, 407, 253, 4081, 7274, 50276, 20261, 253, 3559, 5933, 310, 4460, 253, 2934, 273, 970, 49328, 273, 5556, 34768, 310, 417, 5301, 342, 643, 19862, 3169, 7274, 751, 253, 31331, 1701, 260, 358, 5611, 275, 337, 390, 253, 19862, 1332, 4081, 275, 374, 651, 320, 14109, 50274, 34974, 50276, 5371, 310, 253, 8542, 5649, 273, 970, 260, 358, 689, 643, 771, 813, 658, 7274, 751, 7044, 310, 352, 1199, 7938, 275, 2426, 273, 4588, 673, 26332, 253, 3733, 673, 591, 3213, 310, 4577, 50275, 22309, 310, 417, 253, 26677, 4315, 34003, 310, 352, 984, 253, 1781, 7877, 1319, 273, 253, 4764, 2317, 50276, 74, 2096, 326, 1016, 3213, 275, 253, 3733, 9191, 2723, 281, 247, 2014, 1098, 11656, 507, 539, 533, 326, 253, 2792, 403, 2686, 6607, 591, 9037, 534, 476, 320, 6699, 846, 247, 2781, 1180, 273, 1375, 16307, 310, 619, 4685, 3451, 50276, 37585, 5701, 1580, 253, 1375, 2317, 310, 5611, 347, 247, 8578, 273, 253, 1524, 4972, 2317, 891, 8025, 352, 310, 417, 43998, 7613, 253, 5502, 5912, 3268, 943, 320, 275, 294, 3185, 273, 470, 337, 50276, 18, 256, 362, 5315, 5738, 256, 1182, 38507, 285, 480, 1182, 38507, 5939, 2806, 3364, 13757, 273, 1327, 44181, 3470, 4104, 374, 256, 465, 10178, 4530, 256, 19684, 360, 27083, 246, 295, 515, 274, 1182, 19858, 928, 299, 3034, 254, 256, 278, 38959, 340, 632, 86, 285, 465, 3034, 254, 27549, 16483, 35221, 4715, 6247, 50276, 783, 5933, 280, 2934, 310, 4722, 973, 11407, 323, 391, 77, 285, 3164, 4217, 323, 5556, 5837, 275, 2087, 533, 5301, 342, 643, 19862, 7274, 310, 5816, 275, 253, 2905, 789, 281, 1805, 2096, 697, 38135, 9864, 1543, 403, 12532, 533, 417, 4751, 21414, 347, 352, 2722, 2074, 3733, 3045, 281, 2045, 3082, 533, 352, 310, 417, 2590, 253, 2457, 3045, 273, 253, 4081, 2746, 323, 1142, 273, 253, 954, 4722, 12620, 50276, 74, 3480, 247, 2590, 8542, 16038, 323, 253, 260, 358, 2746, 50275, 7152, 33032, 2520, 2929, 2175, 247, 4460, 260, 358, 1332, 323, 1566, 3169, 391, 77, 1223, 2045, 7274, 908, 247, 36409, 1332, 1754, 327, 247, 32505, 26306, 305, 12064, 390, 305, 12064, 7802, 597, 12661, 247, 40880, 260, 358, 835, 1016, 4227, 10939, 11411, 697, 1211, 941, 285, 1755, 76, 8197, 597, 806, 1071, 616, 747, 1332, 327, 247, 337, 69, 20953, 4836, 4645, 1805, 14940, 281, 253, 4156, 24571, 597, 840, 1071, 616, 1332, 327, 
2067, 1566, 3169, 391, 77, 8892, 41731, 14692, 2045, 3082, 275, 690, 8892, 1223, 1146, 327, 1061, 390, 18134, 275, 2571, 50276, 9072, 50276, 783, 2934, 310, 15246, 533, 1175, 285, 253, 20953, 1650, 6296, 1804, 326, 247, 305, 2188, 2550, 6016, 253, 23390, 351, 1319, 275, 253, 13757, 13016, 50275, 783, 2929, 310, 973, 3542, 1077, 2590, 7424, 403, 44003, 253, 30762, 310, 1663, 7000, 347, 973, 342, 2120, 4373, 19484, 7533, 891, 717, 13762, 1529, 22780, 812, 25464, 841, 4679, 50275, 74, 1663, 751, 253, 337, 69, 20953, 1650, 352, 4245, 247, 1175, 5304, 23356, 273, 253, 1332, 285, 752, 310, 9369, 50275, 20881, 50276, 74, 1158, 634, 278, 1288, 77, 1543, 403, 417, 2221, 2266, 891, 923, 3240, 690, 27620, 9191, 285, 275, 690, 273, 731, 634, 1332, 6296, 3249, 598, 533, 275, 690, 2571, 352, 310, 2686, 2708, 253, 18075, 891, 651, 1333, 253, 1524, 7756, 310, 6571, 323, 12524, 336, 785, 327, 913, 287, 12042, 294, 12844, 1331, 285, 294, 12844, 533, 1677, 326, 310, 2708, 327, 8511, 3803, 285, 2940, 254, 19, 69, 359, 778, 5545, 849, 1199, 2625, 627, 1663, 310, 891, 513, 751, 326, 368, 921, 512, 1543, 1690, 4016, 533, 368, 778, 5777, 294, 40712, 253, 3908, 326, 634, 1332, 2057, 10129, 390, 41731, 13015, 616, 21421, 50275, 85, 899, 3368, 891, 1089, 352, 402, 20733, 326, 260, 358, 72, 2188, 2343, 4863, 253, 4156, 24571, 619, 2022, 2523, 310, 326, 368, 513, 417, 1304, 327, 253, 1180, 273, 7802, 4295, 275, 253, 305, 2188, 436, 3133, 281, 320, 1663, 9560, 2819, 387, 253, 30762, 1543, 352, 3133, 326, 253, 260, 358, 72, 2188, 310, 46144, 12202, 327, 19502, 470, 1907, 1643, 3530, 327, 253, 987, 533, 342, 2217, 7802, 4295, 436, 943, 1335, 320, 4030, 285, 752, 310, 253, 1180, 273, 10872, 275, 12524, 336, 78, 891, 3524, 253, 1072, 347, 253, 1180, 273, 24170, 858, 368, 19928, 1097, 275, 2159, 891, 3480, 690, 4373, 19484, 10165, 368, 1056, 275, 253, 20953, 3368, 50276, 423, 273, 7652, 891, 452, 690, 7596, 4685, 2139, 253, 2408, 273, 941, 4566, 598, 591, 3126, 5016, 1016, 4227, 556, 247, 1027, 3268, 689, 7823, 987, 285, 891, 476, 760, 7472, 581, 273, 841, 7823, 591, 3126, 5016, 891, 1537, 320, 5816, 1633, 1060, 533, 891, 651, 878, 690, 37699, 1060, 50276, 74, 1158, 2905, 789, 556, 247, 2201, 33860, 1580, 352, 760, 16633, 327, 643, 278, 1288, 77, 789, 533, 1057, 417, 2319, 667, 2905, 789, 432, 260, 358, 6239, 556, 436, 40880, 2746, 2168, 644, 3597, 627, 390, 1633, 2074, 840, 352, 476, 1335, 320, 4623, 281, 278, 1288, 77, 533, 368, 943, 3877, 285, 2319, 436, 50275, 783, 2929, 556, 642, 5955, 285, 2852, 789, 534, 891, 1158, 943, 1900, 320, 627, 50275, 783, 14580, 403, 3240, 1355, 285, 275, 690, 2219, 816, 12744, 323, 1650, 275, 4677, 721, 253, 10688, 12425, 310, 2080, 432, 8654, 891, 1663, 452, 247, 1892, 673, 36182, 562, 534, 1386, 310, 534, 327, 619, 11462, 2715, 275, 10688, 1580, 368, 24088, 452, 2502, 285, 14863, 327, 247, 2806, 285, 3168, 3379, 604, 651, 1663, 320, 4336, 275, 36764, 917, 4677, 854, 310, 671, 1663, 1355, 285, 3198, 271, 625, 9470, 11743, 604, 368, 2550, 2572, 253, 7484, 1979, 387, 1878, 2572, 253, 7844, 13301, 275, 1979, 50275, 12550, 16182, 369, 1663, 2590, 533, 247, 1643, 1841, 891, 858, 417, 755, 337, 310, 253, 362, 394, 275, 16186, 495, 247, 7887, 591, 260, 358, 4227, 840, 891, 651, 3630, 362, 48146, 374, 310, 253, 465, 273, 260, 358, 72, 2188, 253, 1072, 347, 253, 465, 323, 1016, 4227, 273, 12524, 336, 78, 26332, 1057, 12524, 260, 358, 755, 625, 1755, 76, 3530, 285, 943, 368, 3451, 323, 436, 891, 1089, 352, 1663, 1892, 281, 5963, 436, 2929, 1580, 352, 556, 2266, 285, 5075, 2792, 347, 5393, 1840, 891, 
751, 253, 2934, 352, 310, 15246, 285, 3477, 281, 15909, 2568, 973, 17194, 253, 2929, 310, 1077, 4518, 3542, 556, 2590, 14951, 285, 4245, 1175, 27350, 23356, 273, 253, 2934, 327, 253, 42719, 891, 513, 417, 1158, 253, 1543, 403, 1663, 21414, 3738, 627, 310, 690, 2625, 891, 3480, 4373, 22041, 327, 253, 1180, 273, 24170, 275, 253, 20953, 3368, 285, 253, 7340, 273, 1543, 281, 11962, 352, 891, 4336, 2985, 2905, 789, 432, 253, 260, 358, 6239, 627, 310, 642, 5955, 285, 2852, 789, 285, 253, 14580, 403, 417, 3477, 281, 1239, 15978, 512, 841, 37616, 1487, 476, 320, 28849, 533, 342, 512, 273, 731, 2366, 891, 755, 275, 5545, 50276, 7152, 339, 431, 248, 2929, 29328, 12524, 336, 78, 534, 4648, 7529, 10872, 273, 260, 358, 281, 3037, 8654, 7823, 323, 3237, 326, 3831, 23390, 26306, 8654, 5231, 3185, 273, 970, 247, 2014, 3646, 347, 10491, 3268, 285, 39793, 253, 5231, 275, 253, 21520, 273, 436, 2250, 18974, 253, 12524, 336, 78, 1332, 4648, 7529, 342, 2709, 7823, 285, 260, 358, 5556, 14460, 3103, 253, 4081, 5933, 476, 3037, 23390, 26306, 5231, 285, 943, 3157, 3410, 6733, 253, 5933, 310, 6760, 327, 253, 2629, 1527, 2284, 22791, 8892, 50275, 783, 2929, 5605, 281, 8415, 253, 1895, 326, 260, 358, 2550, 6016, 23390, 26306, 2250, 10670, 1561, 253, 1755, 76, 3530, 275, 436, 1083, 260, 358, 31218, 689, 253, 2709, 10006, 285, 4419, 625, 3530, 281, 29623, 1223, 436, 1895, 28055, 4961, 285, 581, 476, 6266, 247, 49956, 1650, 835, 436, 1895, 310, 4623, 253, 1895, 310, 417, 4623, 323, 253, 2684, 1566, 3169, 391, 77, 4679, 327, 253, 1527, 2284, 8892, 253, 11333, 452, 4209, 15579, 281, 2740, 253, 2709, 5482, 285, 29623, 281, 247, 2014, 2900, 1309, 247, 1643, 2397, 1136, 723, 253, 3064, 275, 10921, 310, 3798, 417, 1534, 347, 2011, 407, 268, 5338, 74, 1162, 355, 970, 253, 1736, 4090, 534, 671, 32547, 25001, 689, 10006, 3185, 273, 253, 1599, 476, 3157, 20243, 285, 42876, 2572, 253, 10921, 533, 253, 3910, 403, 417, 5667, 432, 619, 3367, 2793, 342, 260, 358, 970, 253, 1736, 4090, 689, 253, 1599, 4254, 476, 7826, 2572, 3045, 275, 690, 12620, 533, 253, 2572, 310, 417, 1534, 253, 2684, 4679, 273, 253, 4477, 513, 417, 921, 1199, 273, 271, 7756, 273, 12524, 336, 78, 689, 253, 643, 3082, 253, 4477, 671, 513, 417, 2085, 271, 7103, 326, 44995, 1880, 253, 13782, 673, 12075, 342, 616, 7529, 2605, 273, 260, 358, 50276, 81, 3422, 447, 33810, 253, 15180, 10454, 1309, 3733, 954, 2779, 5459, 347, 581, 3198, 281, 6194, 295, 7823, 3103, 253, 4081, 5933, 1057, 417, 3157, 253, 16774, 3045, 273, 1684, 3642, 323, 625, 2570, 12620, 253, 4081, 2746, 1537, 320, 12912, 533, 323, 253, 1527, 2284, 22791, 8892, 253, 7529, 2605, 310, 417, 3309, 281, 5115, 247, 1175, 3045, 50275, 783, 2929, 671, 1057, 417, 2085, 747, 10527, 16039, 390, 556, 247, 1175, 16038, 970, 295, 19783, 7823, 50276, 336, 78, 310, 247, 14916, 6880, 273, 1684, 3642, 342, 8767, 5373, 604, 581, 5605, 281, 2953, 253, 32505, 26306, 305, 12064, 9376, 273, 260, 358, 581, 943, 1718, 275, 22453, 1114, 442, 1113, 4213, 3082, 534, 476, 6016, 23390, 26306, 5482, 50276, 67, 11587, 253, 4105, 16038, 273, 253, 2746, 285, 3710, 5661, 1543, 253, 2929, 310, 973, 15720, 285, 3477, 281, 956, 19235, 760, 253, 2934, 310, 417, 4209, 281, 320, 3559, 387, 17857, 32888, 50276, 783, 5750, 273, 253, 4081, 5933, 310, 12744, 285, 627, 403, 642, 10527, 16039, 1561, 253, 2929, 253, 5933, 310, 247, 14916, 6880, 273, 1684, 3642, 285, 556, 1652, 17200, 3103, 891, 5583, 33944, 253, 2929, 50276, 187, 187, 4118, 18435, 27, 2520, 2929, 10262, 247, 40880, 2715, 273, 253, 260, 358, 5853, 835, 271, 19862, 273, 260, 358, 10872, 
1408, 10939, 432, 581, 1529, 285, 1016, 17923, 247, 1980, 7756, 273, 697, 1211, 10491, 3268, 253, 2929, 2722, 326, 253, 4081, 5853, 476, 33623, 253, 1895, 273, 36409, 260, 358, 2905, 281, 5975, 3390, 281, 247, 1980, 24571, 253, 2929, 3797, 247, 10527, 1783, 285, 9864, 4679, 326, 921, 690, 5373, 273, 253, 4081, 5853, 689, 36409, 260, 358, 50276, 783, 2234, 43680, 432, 253, 30628, 2486, 253, 15246, 3753, 273, 253, 4081, 2934, 534, 7787, 253, 7681, 7680, 273, 253, 2929, 347, 973, 347, 253, 3710, 11701, 689, 36409, 260, 358, 275, 253, 9864, 4679, 50275, 249, 6010, 436, 310, 247, 45210, 2929, 1223, 253, 2929, 310, 973, 15720, 285, 253, 4081, 2746, 310, 4518, 5544, 253, 3480, 273, 2266, 16774, 1543, 326, 921, 247, 17088, 7756, 273, 40880, 260, 358, 9904, 342, 253, 32809, 3753, 273, 253, 2934, 273, 40880, 260, 358, 2789, 479, 9644, 2584, 247, 18235 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 271, 6880, 273, 253, 2831, 290, 10144, 1332, 260, 358, 323, 5556, 5837, 326, 8414, 275, 271, 19862, 273, 2709, 2629, 260, 358, 10872, 1016, 581, 1146, 5556, 1701, 10939, 285, 253, 2900, 387, 1016, 3213, 310, 253, 3453, 273, 253, 1755, 40247, 260, 358, 387, 326, 3213, 9864, 1543, 275, 247, 5415, 1453, 22791, 921, 326, 253, 4081, 5933, 10129, 390, 41731, 13015, 2045, 260, 358, 11640, 253, 4477, 671, 897, 247, 20953, 1895, 281, 17093, 849, 970, 271, 19862, 476, 1361, 281, 8773, 1980, 5927, 50275, 296, 3755, 20556, 50276, 5302, 271, 19862, 273, 5556, 34768, 310, 247, 6422, 2934, 326, 476, 1361, 281, 8338, 253, 2900, 2317, 625, 14556, 285, 943, 320, 14859, 2007, 50276, 783, 5415, 1453, 22791, 310, 4623, 323, 436, 4836, 285, 253, 1666, 25379, 403, 24600, 275, 1798, 1907, 2908, 7044, 347, 247, 3806, 310, 14109, 50276, 3549, 1427, 1543, 921, 2074, 390, 1805, 3045, 281, 643, 260, 358, 11640, 275, 954, 273, 253, 12620, 50276, 783, 2929, 310, 3839, 973, 3542, 285, 3477, 281, 956, 50276, 20881, 1255, 265, 50276, 31158, 9191, 403, 6331, 1078, 14940, 323, 1142, 12620, 436, 2789, 2834, 281, 7472, 253, 7882, 285, 2457, 3045, 273, 253, 4081, 2746, 3340, 323, 1110, 12620, 275, 534, 253, 2457, 3045, 273, 7044, 2806, 17889, 1386, 310, 1199, 2169, 685, 326, 6786, 407, 253, 4081, 7274, 50276, 20261, 253, 3559, 5933, 310, 4460, 253, 2934, 273, 970, 49328, 273, 5556, 34768, 310, 417, 5301, 342, 643, 19862, 3169, 7274, 751, 253, 31331, 1701, 260, 358, 5611, 275, 337, 390, 253, 19862, 1332, 4081, 275, 374, 651, 320, 14109, 50274, 34974, 50276, 5371, 310, 253, 8542, 5649, 273, 970, 260, 358, 689, 643, 771, 813, 658, 7274, 751, 7044, 310, 352, 1199, 7938, 275, 2426, 273, 4588, 673, 26332, 253, 3733, 673, 591, 3213, 310, 4577, 50275, 22309, 310, 417, 253, 26677, 4315, 34003, 310, 352, 984, 253, 1781, 7877, 1319, 273, 253, 4764, 2317, 50276, 74, 2096, 326, 1016, 3213, 275, 253, 3733, 9191, 2723, 281, 247, 2014, 1098, 11656, 507, 539, 533, 326, 253, 2792, 403, 2686, 6607, 591, 9037, 534, 476, 320, 6699, 846, 247, 2781, 1180, 273, 1375, 16307, 310, 619, 4685, 3451, 50276, 37585, 5701, 1580, 253, 1375, 2317, 310, 5611, 347, 247, 8578, 273, 253, 1524, 4972, 2317, 891, 8025, 352, 310, 417, 43998, 7613, 253, 5502, 5912, 3268, 943, 320, 275, 294, 3185, 273, 470, 337, 50276, 18, 256, 362, 5315, 5738, 256, 1182, 38507, 285, 480, 1182, 38507, 5939, 2806, 3364, 13757, 273, 1327, 44181, 3470, 4104, 374, 256, 465, 10178, 4530, 256, 19684, 360, 27083, 246, 295, 515, 274, 1182, 19858, 928, 299, 3034, 254, 256, 278, 38959, 340, 632, 86, 285, 465, 3034, 254, 27549, 16483, 35221, 4715, 6247, 50276, 783, 5933, 280, 2934, 310, 4722, 973, 11407, 323, 391, 77, 285, 3164, 4217, 323, 5556, 5837, 275, 2087, 533, 5301, 342, 643, 19862, 7274, 310, 5816, 275, 253, 2905, 789, 281, 1805, 2096, 697, 38135, 9864, 1543, 403, 12532, 533, 417, 4751, 21414, 347, 352, 2722, 2074, 3733, 3045, 281, 2045, 3082, 533, 352, 310, 417, 2590, 253, 2457, 3045, 273, 253, 4081, 2746, 323, 1142, 273, 253, 954, 4722, 12620, 50276, 74, 3480, 247, 2590, 8542, 16038, 323, 253, 260, 358, 2746, 50275, 7152, 33032, 2520, 2929, 2175, 247, 4460, 260, 358, 1332, 323, 1566, 3169, 391, 77, 1223, 2045, 7274, 908, 247, 36409, 1332, 1754, 327, 247, 32505, 26306, 305, 12064, 390, 305, 12064, 7802, 597, 12661, 247, 40880, 260, 358, 835, 1016, 4227, 10939, 11411, 697, 1211, 941, 285, 1755, 76, 8197, 597, 806, 1071, 616, 747, 1332, 327, 247, 337, 69, 20953, 4836, 4645, 1805, 14940, 281, 253, 4156, 24571, 597, 840, 1071, 616, 1332, 327, 
2067, 1566, 3169, 391, 77, 8892, 41731, 14692, 2045, 3082, 275, 690, 8892, 1223, 1146, 327, 1061, 390, 18134, 275, 2571, 50276, 9072, 50276, 783, 2934, 310, 15246, 533, 1175, 285, 253, 20953, 1650, 6296, 1804, 326, 247, 305, 2188, 2550, 6016, 253, 23390, 351, 1319, 275, 253, 13757, 13016, 50275, 783, 2929, 310, 973, 3542, 1077, 2590, 7424, 403, 44003, 253, 30762, 310, 1663, 7000, 347, 973, 342, 2120, 4373, 19484, 7533, 891, 717, 13762, 1529, 22780, 812, 25464, 841, 4679, 50275, 74, 1663, 751, 253, 337, 69, 20953, 1650, 352, 4245, 247, 1175, 5304, 23356, 273, 253, 1332, 285, 752, 310, 9369, 50275, 20881, 50276, 74, 1158, 634, 278, 1288, 77, 1543, 403, 417, 2221, 2266, 891, 923, 3240, 690, 27620, 9191, 285, 275, 690, 273, 731, 634, 1332, 6296, 3249, 598, 533, 275, 690, 2571, 352, 310, 2686, 2708, 253, 18075, 891, 651, 1333, 253, 1524, 7756, 310, 6571, 323, 12524, 336, 785, 327, 913, 287, 12042, 294, 12844, 1331, 285, 294, 12844, 533, 1677, 326, 310, 2708, 327, 8511, 3803, 285, 2940, 254, 19, 69, 359, 778, 5545, 849, 1199, 2625, 627, 1663, 310, 891, 513, 751, 326, 368, 921, 512, 1543, 1690, 4016, 533, 368, 778, 5777, 294, 40712, 253, 3908, 326, 634, 1332, 2057, 10129, 390, 41731, 13015, 616, 21421, 50275, 85, 899, 3368, 891, 1089, 352, 402, 20733, 326, 260, 358, 72, 2188, 2343, 4863, 253, 4156, 24571, 619, 2022, 2523, 310, 326, 368, 513, 417, 1304, 327, 253, 1180, 273, 7802, 4295, 275, 253, 305, 2188, 436, 3133, 281, 320, 1663, 9560, 2819, 387, 253, 30762, 1543, 352, 3133, 326, 253, 260, 358, 72, 2188, 310, 46144, 12202, 327, 19502, 470, 1907, 1643, 3530, 327, 253, 987, 533, 342, 2217, 7802, 4295, 436, 943, 1335, 320, 4030, 285, 752, 310, 253, 1180, 273, 10872, 275, 12524, 336, 78, 891, 3524, 253, 1072, 347, 253, 1180, 273, 24170, 858, 368, 19928, 1097, 275, 2159, 891, 3480, 690, 4373, 19484, 10165, 368, 1056, 275, 253, 20953, 3368, 50276, 423, 273, 7652, 891, 452, 690, 7596, 4685, 2139, 253, 2408, 273, 941, 4566, 598, 591, 3126, 5016, 1016, 4227, 556, 247, 1027, 3268, 689, 7823, 987, 285, 891, 476, 760, 7472, 581, 273, 841, 7823, 591, 3126, 5016, 891, 1537, 320, 5816, 1633, 1060, 533, 891, 651, 878, 690, 37699, 1060, 50276, 74, 1158, 2905, 789, 556, 247, 2201, 33860, 1580, 352, 760, 16633, 327, 643, 278, 1288, 77, 789, 533, 1057, 417, 2319, 667, 2905, 789, 432, 260, 358, 6239, 556, 436, 40880, 2746, 2168, 644, 3597, 627, 390, 1633, 2074, 840, 352, 476, 1335, 320, 4623, 281, 278, 1288, 77, 533, 368, 943, 3877, 285, 2319, 436, 50275, 783, 2929, 556, 642, 5955, 285, 2852, 789, 534, 891, 1158, 943, 1900, 320, 627, 50275, 783, 14580, 403, 3240, 1355, 285, 275, 690, 2219, 816, 12744, 323, 1650, 275, 4677, 721, 253, 10688, 12425, 310, 2080, 432, 8654, 891, 1663, 452, 247, 1892, 673, 36182, 562, 534, 1386, 310, 534, 327, 619, 11462, 2715, 275, 10688, 1580, 368, 24088, 452, 2502, 285, 14863, 327, 247, 2806, 285, 3168, 3379, 604, 651, 1663, 320, 4336, 275, 36764, 917, 4677, 854, 310, 671, 1663, 1355, 285, 3198, 271, 625, 9470, 11743, 604, 368, 2550, 2572, 253, 7484, 1979, 387, 1878, 2572, 253, 7844, 13301, 275, 1979, 50275, 12550, 16182, 369, 1663, 2590, 533, 247, 1643, 1841, 891, 858, 417, 755, 337, 310, 253, 362, 394, 275, 16186, 495, 247, 7887, 591, 260, 358, 4227, 840, 891, 651, 3630, 362, 48146, 374, 310, 253, 465, 273, 260, 358, 72, 2188, 253, 1072, 347, 253, 465, 323, 1016, 4227, 273, 12524, 336, 78, 26332, 1057, 12524, 260, 358, 755, 625, 1755, 76, 3530, 285, 943, 368, 3451, 323, 436, 891, 1089, 352, 1663, 1892, 281, 5963, 436, 2929, 1580, 352, 556, 2266, 285, 5075, 2792, 347, 5393, 1840, 891, 
751, 253, 2934, 352, 310, 15246, 285, 3477, 281, 15909, 2568, 973, 17194, 253, 2929, 310, 1077, 4518, 3542, 556, 2590, 14951, 285, 4245, 1175, 27350, 23356, 273, 253, 2934, 327, 253, 42719, 891, 513, 417, 1158, 253, 1543, 403, 1663, 21414, 3738, 627, 310, 690, 2625, 891, 3480, 4373, 22041, 327, 253, 1180, 273, 24170, 275, 253, 20953, 3368, 285, 253, 7340, 273, 1543, 281, 11962, 352, 891, 4336, 2985, 2905, 789, 432, 253, 260, 358, 6239, 627, 310, 642, 5955, 285, 2852, 789, 285, 253, 14580, 403, 417, 3477, 281, 1239, 15978, 512, 841, 37616, 1487, 476, 320, 28849, 533, 342, 512, 273, 731, 2366, 891, 755, 275, 5545, 50276, 7152, 339, 431, 248, 2929, 29328, 12524, 336, 78, 534, 4648, 7529, 10872, 273, 260, 358, 281, 3037, 8654, 7823, 323, 3237, 326, 3831, 23390, 26306, 8654, 5231, 3185, 273, 970, 247, 2014, 3646, 347, 10491, 3268, 285, 39793, 253, 5231, 275, 253, 21520, 273, 436, 2250, 18974, 253, 12524, 336, 78, 1332, 4648, 7529, 342, 2709, 7823, 285, 260, 358, 5556, 14460, 3103, 253, 4081, 5933, 476, 3037, 23390, 26306, 5231, 285, 943, 3157, 3410, 6733, 253, 5933, 310, 6760, 327, 253, 2629, 1527, 2284, 22791, 8892, 50275, 783, 2929, 5605, 281, 8415, 253, 1895, 326, 260, 358, 2550, 6016, 23390, 26306, 2250, 10670, 1561, 253, 1755, 76, 3530, 275, 436, 1083, 260, 358, 31218, 689, 253, 2709, 10006, 285, 4419, 625, 3530, 281, 29623, 1223, 436, 1895, 28055, 4961, 285, 581, 476, 6266, 247, 49956, 1650, 835, 436, 1895, 310, 4623, 253, 1895, 310, 417, 4623, 323, 253, 2684, 1566, 3169, 391, 77, 4679, 327, 253, 1527, 2284, 8892, 253, 11333, 452, 4209, 15579, 281, 2740, 253, 2709, 5482, 285, 29623, 281, 247, 2014, 2900, 1309, 247, 1643, 2397, 1136, 723, 253, 3064, 275, 10921, 310, 3798, 417, 1534, 347, 2011, 407, 268, 5338, 74, 1162, 355, 970, 253, 1736, 4090, 534, 671, 32547, 25001, 689, 10006, 3185, 273, 253, 1599, 476, 3157, 20243, 285, 42876, 2572, 253, 10921, 533, 253, 3910, 403, 417, 5667, 432, 619, 3367, 2793, 342, 260, 358, 970, 253, 1736, 4090, 689, 253, 1599, 4254, 476, 7826, 2572, 3045, 275, 690, 12620, 533, 253, 2572, 310, 417, 1534, 253, 2684, 4679, 273, 253, 4477, 513, 417, 921, 1199, 273, 271, 7756, 273, 12524, 336, 78, 689, 253, 643, 3082, 253, 4477, 671, 513, 417, 2085, 271, 7103, 326, 44995, 1880, 253, 13782, 673, 12075, 342, 616, 7529, 2605, 273, 260, 358, 50276, 81, 3422, 447, 33810, 253, 15180, 10454, 1309, 3733, 954, 2779, 5459, 347, 581, 3198, 281, 6194, 295, 7823, 3103, 253, 4081, 5933, 1057, 417, 3157, 253, 16774, 3045, 273, 1684, 3642, 323, 625, 2570, 12620, 253, 4081, 2746, 1537, 320, 12912, 533, 323, 253, 1527, 2284, 22791, 8892, 253, 7529, 2605, 310, 417, 3309, 281, 5115, 247, 1175, 3045, 50275, 783, 2929, 671, 1057, 417, 2085, 747, 10527, 16039, 390, 556, 247, 1175, 16038, 970, 295, 19783, 7823, 50276, 336, 78, 310, 247, 14916, 6880, 273, 1684, 3642, 342, 8767, 5373, 604, 581, 5605, 281, 2953, 253, 32505, 26306, 305, 12064, 9376, 273, 260, 358, 581, 943, 1718, 275, 22453, 1114, 442, 1113, 4213, 3082, 534, 476, 6016, 23390, 26306, 5482, 50276, 67, 11587, 253, 4105, 16038, 273, 253, 2746, 285, 3710, 5661, 1543, 253, 2929, 310, 973, 15720, 285, 3477, 281, 956, 19235, 760, 253, 2934, 310, 417, 4209, 281, 320, 3559, 387, 17857, 32888, 50276, 783, 5750, 273, 253, 4081, 5933, 310, 12744, 285, 627, 403, 642, 10527, 16039, 1561, 253, 2929, 253, 5933, 310, 247, 14916, 6880, 273, 1684, 3642, 285, 556, 1652, 17200, 3103, 891, 5583, 33944, 253, 2929, 50276, 187, 187, 4118, 18435, 27, 2520, 2929, 10262, 247, 40880, 2715, 273, 253, 260, 358, 5853, 835, 271, 19862, 273, 260, 358, 10872, 
1408, 10939, 432, 581, 1529, 285, 1016, 17923, 247, 1980, 7756, 273, 697, 1211, 10491, 3268, 253, 2929, 2722, 326, 253, 4081, 5853, 476, 33623, 253, 1895, 273, 36409, 260, 358, 2905, 281, 5975, 3390, 281, 247, 1980, 24571, 253, 2929, 3797, 247, 10527, 1783, 285, 9864, 4679, 326, 921, 690, 5373, 273, 253, 4081, 5853, 689, 36409, 260, 358, 50276, 783, 2234, 43680, 432, 253, 30628, 2486, 253, 15246, 3753, 273, 253, 4081, 2934, 534, 7787, 253, 7681, 7680, 273, 253, 2929, 347, 973, 347, 253, 3710, 11701, 689, 36409, 260, 358, 275, 253, 9864, 4679, 50275, 249, 6010, 436, 310, 247, 45210, 2929, 1223, 253, 2929, 310, 973, 15720, 285, 253, 4081, 2746, 310, 4518, 5544, 253, 3480, 273, 2266, 16774, 1543, 326, 921, 247, 17088, 7756, 273, 40880, 260, 358, 9904, 342, 253, 32809, 3753, 273, 253, 2934, 273, 40880, 260, 358, 2789, 479, 9644, 2584, 247, 18235 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper presents a dataset apt36k for animal pose estimation and tracking apt36k consists of 2400 video clips collected and filtered from 30 animal species with 15 frames for each video resulting in 36000 frames in total based on apt36k three tasks are set up 1 single frame pose estimation 2 interspecies domain generalization test and 3 animal pose estimation and tracking several representative models are evaluated on the three tasks main contributions 1 a largescale benchmark dataset apt36k for animal pose estimation and tracking 2 the setup of three tasks with several representative models benchmarked 1 the proposed apt36k is the first dataset for animal pose estimation and tracking the scale of the apt36k is relatively large compared with existing animal pose estimation datasets the dataset is carefully collected to cover different animal species from different scenes 2 three tasks are set up to evaluate several representative models the results and analysis provide some insights which are useful for researchers in this field for the single frame pose estimation task several representative cnn and transformerbased models are compared and also the impact of pretraining is investigated the experiments for the interspecies animal pose generalization task show that to achieve a good generalization performance for a specific species it is usually necessary to collect some training data of species which belong to the same family 1 for the interspecies animal pose generalization task only 6 representative animal families are used what would the results and conclusion be when all the species are included 2 for the animal pose tracking task only single object tracking methods are evaluated some multiple object tracking methods should also be included for more thorough comparison typically some human pose estimation and tracking methods eg simplebaseline 34 need to be evaluated docsepthis work presents a new benchmark apt36k for animal pose estimation and tracking apt36k consists of 2400 video clips and 36000 frames for 30 animal species it is the first benchmark for both animal pose estimation and pose tracking based on apt36k the authors benchmark three different tasks 1 singleframe animal pose estimation 2 interspecies domain generalization 3 animal pose estimation and tracking largescale annotations detailed statistics of the apt36k dataset are provided the animal motions are diverse comprehensive experiments are conducted to benchmark 3 different tracks the authors should provide the details of metric ap since ap is calculated using the metric object keypoint similarity oks which needs to define the perkeypoint constant ki for each keypoint i ki controls the falloff and varies among different keypoints for each video clip the authors sample 15 frames for annotations and the sample rate depends on the animal motions i suggest the authors provide statistics about the video length this is important for pose tracking since a longer video is more difficult to track the authors should use more metrics for pose tracking there are many standard metrics for pose tracking such as mota motp id switch docsepthis paper describes the apt36k dataset which consists of 2400 video sequences each of 16 frames of annotated video with applications to video tracking and pose estimation the videos are balanced across 15 families of animals 30 species allowing for estimation of crossfamily generalization
the authors give benchmarks for the dataset in three categories single frame pose estimation crossspecies generalization and pose tracking where an object tracking approach is needed to identify the bounding box in each frame the paper is clear and wellwritten and the datasheet and methods are well described i am enthusiastic about this manuscript as a resource for the community however i have several reservations first and foremost as described below i am not sure it is permissible to download youtube videos that make up the dataset because of this i am voting not to accept the manuscript but if this issue is reconciled i would consider accepting second the dataset is more of an incremental advance over last years ap10k this does not preclude its acceptance but does reduce its novelty for the community of the three tasks presented only one requires the continuous video data uniquely included here the paper is clear the dataset and ethics are well documented and the dataset is of good value to the community my main reservations are on the novelty of the dataset and the licensing data licensing the data is described as originating from youtube according to the youtube terms of service httpswwwyoutubecomstatictemplateterms you are not allowed to 1 access reproduce download distribute transmit broadcast display sell license alter modify or otherwise use any part of the service or any content except a as expressly authorized by the service or b with prior written permission from youtube and if applicable the respective rights holders moreover it is unclear if the particular videos are licensed under the youtube standard license or a ccby license which would affect whether or not they could be distributed and modified even for noncommercial use i am not a legal expert but my understanding is that this dataset may be in violation of 1 the youtube service agreement pertinent since alphabet is typically a neurips sponsor and 2 the copyright holders of these videos unless youtube has specifically granted permission for their use novelty the ap10k dataset was at neuripsdb 2021 and this dataset is fairly similar there are key differences the data here is better balanced around a few categories and these are videos but many of the tasks used are pose estimation in animals there are occlusions and tracking errors in some clips is there a ground truth upper bound on the possible map in this dataset docsepin this work apt36k is introduced a new dataset for animal pose estimation and tracking apt36k consists of 36k frames sampled from youtube videos and manually annotated for 53k animal instances in total 30 animal species are captured with 80 video clips per species making this a balanced set a significant annotation effort is carried out as apt36k includes animal keypoints bounding boxes instance level ids and background class labels dataset statistics are thoroughly presented and baseline performance for sota pose estimation models is reported given the lack of videobased datasets for animal pose estimation apt36k can be valuable for further research paper is well written dataset statistics and data collection details are clearly mentioned multiple sota pose estimation and tracking approaches are evaluated authors report performance under various weight initialisation and multiple backbones providing useful benchmarks for further work lack of large data collections remains a limiting factor for animal pose estimation research especially given the existence of arbitrarily many animal species and the large appearance and shape variation for different animals largescale data collections can be valuable for enabling further research in the area on the sf track authors report model performance over all captured animal species the reader could potentially benefit from a more granular analysis showing performance per individual species this can lead to a better understanding of how sota models can capture different animal categories are some species more challenging than others in the is track authors evaluate the generalisation abilities of an hrnet32 on unseen animal species the outcome of this analysis is rather intuitive a supervised pose estimation model would have a significant performance decrease on an unseen object category is interspecies generalisation expected by supervised models for example the semantics of keypoints might change for different animals as stated in lines 166168 even if a model is rather consistent in capturing an unseen category it is possible that it can follow a slightly different keypoint configuration than the one manually defined in the groundtruth ie consistently capturing the same semantic point in slightly different object locations leading to large error values i am concerned about the significance of this analysis or its suitability for the comparison of different models a more appropriate path for evaluating the generalisation of different models could be through a transfer or few shot learning pipeline related experimentation was included in 38 for the apt10k dataset the claim welltrained annotators might be a little vague and some more details regarding the crosschecking performed could be informative as also mentioned by the authors this dataset is still relatively small compared to human pose estimation benchmarks particularly given the large number of animals captured leading to a few thousand images per animal species docsepthis paper introduces a new dataset apt36k for animal pose estimation and tracking apt36k consists of 2400 video clips with 30 animal species the authors set up three tracks sf is apt based on their collected dataset the paper is well written and easy to follow the authors set up three different tracks on this dataset including sf is and apt tracks experiments seem technically sound my only concern is the novelty of this dataset on animal pose estimation afaik the pose estimation in the animal kingdom dataset 26 consists of 33k frames corresponding to 850 animal species the animal kingdom dataset has more animal species with the same amount of frames so the reviewer thinks the animal kingdom dataset is more challenging than the proposed dataset on animal pose estimation since the proposed dataset contains one more animal pose tracking task i prefer to put the rating at marginally above acceptance threshold ### Summary:
after careful review i believe that this paper is a useful contribution to the study of animal pose estimation and tracking the paper received positive reviews from all the reviewers and the authors have successfully addressed the concerns regarding the lack of sufficient novelty and insufficient benchmarking the authors also added additional low shot experiments to demonstrate the usefulness of the dataset based on this i think the paper meets the bar for the track and should be accepted
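One of the reviews above asks for the details of the AP metric, noting that keypoint AP is computed from object keypoint similarity (OKS) and that the per-keypoint constants ki must be defined. As a reference for what is being requested, the sketch below shows a generic OKS computation; the per-keypoint constants, the use of the bounding-box area as the scale term, and the example numbers are assumptions of this illustration, not values taken from apt36k.

```python
import numpy as np

def oks(pred, gt, visible, area, k):
    """Object keypoint similarity between one predicted and one ground-truth pose.

    pred, gt : (n, 2) arrays of keypoint coordinates
    visible  : (n,) boolean mask of labelled keypoints
    area     : object scale term, here assumed to be the ground-truth box area
    k        : (n,) per-keypoint fall-off constants; coco calibrates one value
               per human keypoint, and an animal benchmark would need its own
    """
    d2 = np.sum((pred - gt) ** 2, axis=1)          # squared pixel distances
    scale = max(float(area), 1e-6)
    sim = np.exp(-d2 / (2.0 * scale * k ** 2))     # per-keypoint similarity
    if not visible.any():
        return 0.0
    return float(sim[visible].mean())

if __name__ == "__main__":
    # Hypothetical 17-keypoint pose with a single placeholder constant.
    rng = np.random.default_rng(0)
    gt = rng.uniform(0, 100, size=(17, 2))
    pred = gt + rng.normal(0, 2, size=(17, 2))
    keypoint_visible = np.ones(17, dtype=bool)
    k = np.full(17, 0.05)
    print(f"oks = {oks(pred, gt, keypoint_visible, area=100 * 100, k=k):.3f}")
```

AP is then obtained by thresholding OKS the way detection AP thresholds IoU, typically averaging precision over OKS thresholds from 0.50 to 0.95; reporting the chosen ki values alongside the AP numbers is what the reviewer asks the authors to add.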
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 10262, 247, 10895, 13390, 1812, 76, 323, 5893, 16753, 13418, 285, 12544, 13390, 1812, 76, 8414, 273, 2164, 361, 3492, 29205, 5728, 285, 18748, 432, 1884, 5893, 3417, 342, 1458, 13009, 323, 1016, 3492, 4795, 275, 5540, 933, 13009, 275, 2264, 1754, 327, 13390, 1812, 76, 1264, 8892, 403, 873, 598, 337, 2014, 3665, 16753, 13418, 374, 734, 28956, 5028, 5978, 1071, 285, 495, 5893, 16753, 13418, 285, 12544, 2067, 8612, 3210, 403, 6760, 327, 253, 1264, 8892, 50276, 7265, 9021, 337, 247, 1236, 2510, 25912, 22791, 10895, 13390, 1812, 76, 323, 5893, 16753, 13418, 285, 12544, 374, 253, 9978, 273, 1264, 8892, 342, 2067, 8612, 3210, 22791, 264, 50276, 18, 253, 4081, 13390, 1812, 76, 310, 253, 806, 10895, 323, 5893, 16753, 13418, 285, 12544, 253, 4311, 273, 253, 1049, 76, 1812, 76, 310, 4942, 1781, 2429, 342, 5368, 5893, 16753, 13418, 15302, 253, 10895, 310, 9257, 5728, 281, 3835, 1027, 5893, 3417, 432, 1027, 13451, 50275, 19, 1264, 8892, 403, 873, 598, 281, 7472, 2067, 8612, 3210, 253, 1543, 285, 1783, 2085, 690, 16039, 534, 403, 4217, 323, 8607, 275, 436, 1673, 323, 253, 2014, 3665, 16753, 13418, 4836, 2067, 8612, 260, 9866, 285, 39707, 3169, 3210, 403, 2429, 285, 671, 253, 3486, 273, 3215, 26208, 310, 6949, 253, 4679, 323, 253, 734, 28956, 5893, 16753, 26647, 4836, 921, 326, 281, 5115, 247, 1175, 26647, 3045, 323, 247, 2173, 3417, 352, 310, 3798, 3309, 281, 4822, 690, 3733, 941, 273, 3417, 534, 5663, 281, 253, 1072, 2021, 50276, 18, 323, 253, 734, 28956, 5893, 16753, 26647, 4836, 760, 721, 8612, 5893, 5870, 403, 908, 752, 651, 253, 1543, 285, 6452, 320, 672, 512, 253, 3417, 403, 2908, 50276, 19, 323, 253, 5893, 16753, 12544, 4836, 760, 2014, 1789, 12544, 3082, 403, 6760, 690, 2709, 1789, 12544, 3082, 943, 671, 320, 2908, 323, 625, 11080, 5301, 5431, 690, 1966, 16753, 13418, 285, 12544, 3082, 24088, 2969, 44650, 5910, 403, 3058, 281, 320, 6760, 50276, 7152, 33032, 2520, 789, 10262, 247, 747, 22791, 13390, 1812, 76, 323, 5893, 16753, 13418, 285, 12544, 13390, 1812, 76, 8414, 273, 2164, 361, 3492, 29205, 285, 5540, 933, 13009, 323, 1884, 5893, 3417, 352, 310, 253, 806, 22791, 323, 1097, 5893, 16753, 13418, 285, 16753, 12544, 1754, 327, 13390, 1812, 76, 253, 4477, 22791, 1264, 1027, 8892, 337, 2014, 6301, 5893, 16753, 13418, 374, 734, 28956, 5028, 26647, 495, 5893, 16753, 13418, 285, 12544, 50276, 14915, 20039, 1079, 31825, 50276, 5992, 7193, 9990, 273, 253, 13390, 1812, 76, 10895, 403, 2530, 253, 5893, 14462, 403, 11117, 50276, 3118, 8391, 422, 4679, 403, 5196, 281, 22791, 495, 1027, 11411, 50276, 783, 4477, 943, 2085, 253, 4278, 273, 7982, 1049, 1580, 1049, 310, 5118, 970, 253, 7982, 1789, 2234, 3659, 14259, 258, 661, 534, 3198, 281, 4853, 253, 591, 2364, 81, 834, 3638, 25130, 323, 1016, 2234, 3659, 891, 25130, 5760, 253, 2965, 2727, 285, 16149, 2190, 1027, 2234, 10801, 50276, 1542, 1016, 3492, 17230, 253, 4477, 3410, 1458, 13009, 323, 31825, 285, 253, 3410, 2281, 7024, 327, 253, 5893, 14462, 891, 1804, 253, 4477, 2085, 9990, 670, 253, 3492, 2978, 436, 310, 1774, 323, 16753, 12544, 1580, 247, 3356, 3492, 310, 625, 2834, 281, 3540, 50276, 783, 4477, 943, 897, 625, 17082, 323, 16753, 12544, 627, 403, 1142, 2629, 17082, 323, 16753, 12544, 824, 347, 1733, 66, 1733, 81, 2654, 5234, 5474, 33032, 2520, 2929, 8631, 253, 13390, 1812, 76, 10895, 8414, 273, 2164, 361, 3492, 6430, 1016, 327, 1668, 13009, 273, 28267, 3492, 342, 4893, 281, 3492, 12544, 285, 16753, 
13418, 253, 10556, 403, 16645, 2439, 1458, 5870, 273, 5074, 1884, 3417, 6941, 323, 13418, 273, 2831, 11807, 26647, 253, 4477, 1918, 49602, 323, 253, 10895, 275, 1264, 9050, 50276, 20199, 3665, 16753, 13418, 2831, 28956, 26647, 285, 16753, 12544, 835, 271, 1789, 12544, 2746, 310, 3058, 281, 4271, 253, 41113, 3817, 275, 1016, 3665, 253, 2929, 310, 2590, 285, 973, 15720, 285, 253, 7621, 14934, 285, 3082, 403, 973, 2529, 50275, 74, 717, 31905, 670, 436, 7714, 347, 247, 7741, 323, 253, 3114, 2299, 891, 452, 2067, 33196, 806, 285, 31971, 347, 2529, 2708, 891, 717, 417, 2119, 352, 310, 32588, 281, 6184, 49683, 10556, 326, 1056, 598, 253, 10895, 984, 273, 436, 891, 717, 13423, 417, 281, 2997, 253, 7714, 533, 604, 436, 2523, 310, 30855, 4206, 891, 651, 1908, 18738, 1273, 253, 10895, 310, 625, 273, 271, 32809, 7170, 689, 1390, 1107, 1049, 740, 76, 436, 1057, 417, 31423, 697, 14924, 533, 1057, 4796, 697, 38135, 323, 253, 3114, 273, 253, 1264, 8892, 3559, 760, 581, 4419, 253, 5415, 3492, 941, 22506, 2908, 1060, 50275, 783, 2929, 310, 2590, 253, 10895, 285, 18035, 403, 973, 14290, 285, 253, 10895, 310, 273, 1175, 1318, 281, 253, 3114, 278, 619, 2022, 33196, 403, 327, 253, 38135, 273, 253, 10895, 285, 253, 26920, 50275, 2203, 26920, 253, 941, 310, 2529, 347, 29660, 432, 49683, 50276, 35861, 281, 253, 49683, 2426, 273, 2579, 5987, 2700, 24583, 681, 8766, 882, 12837, 255, 13109, 84, 50276, 5658, 403, 417, 4136, 281, 337, 186, 10773, 18302, 6184, 16969, 13185, 10675, 3148, 5580, 7981, 6990, 10007, 390, 5010, 897, 667, 629, 273, 253, 2579, 390, 667, 2600, 3707, 247, 347, 20251, 14047, 407, 253, 2579, 390, 270, 342, 2720, 3542, 9214, 432, 49683, 285, 604, 7763, 253, 9056, 3570, 26160, 50275, 3062, 1189, 352, 310, 12744, 604, 253, 1798, 10556, 403, 17236, 762, 253, 49683, 2629, 7981, 390, 247, 25215, 1615, 7981, 534, 651, 2818, 1880, 390, 417, 597, 812, 320, 5939, 285, 7321, 1014, 323, 1327, 37763, 897, 891, 717, 417, 247, 4320, 6485, 533, 619, 4685, 310, 326, 436, 10895, 778, 320, 275, 8411, 273, 337, 253, 49683, 2579, 4345, 21452, 1580, 30156, 310, 5431, 247, 5723, 2824, 28924, 285, 374, 253, 9451, 26160, 273, 841, 3492, 5734, 49683, 556, 5742, 7169, 9214, 323, 616, 897, 50274, 2369, 652, 555, 253, 1049, 740, 76, 10895, 369, 387, 5723, 2824, 5470, 43425, 285, 436, 10895, 310, 9648, 2074, 627, 403, 2234, 3910, 50276, 783, 941, 1060, 310, 1805, 16645, 1475, 247, 1643, 9050, 285, 841, 403, 10556, 533, 1142, 273, 253, 8892, 908, 403, 16753, 13418, 275, 5074, 50275, 9088, 403, 15715, 7797, 285, 12544, 6332, 275, 690, 29205, 50276, 261, 627, 247, 3216, 5083, 5170, 3033, 327, 253, 1896, 3711, 275, 436, 10895, 50276, 7152, 339, 9852, 436, 789, 13390, 1812, 76, 310, 5611, 247, 747, 10895, 323, 5893, 16753, 13418, 285, 12544, 13390, 1812, 76, 8414, 273, 5540, 76, 13009, 19958, 432, 49683, 10556, 285, 13542, 28267, 323, 8676, 76, 5893, 10872, 275, 2264, 1884, 5893, 3417, 403, 10848, 342, 5096, 3492, 29205, 591, 3417, 2403, 436, 247, 16645, 873, 247, 1534, 22581, 3434, 310, 4824, 562, 347, 13390, 1812, 76, 3797, 5893, 2234, 10801, 41113, 12783, 4227, 1268, 44077, 285, 4114, 966, 13301, 10895, 9990, 403, 16575, 3559, 285, 8245, 3045, 323, 256, 5503, 16753, 13418, 3210, 310, 2361, 1677, 253, 3480, 273, 8851, 706, 833, 15302, 323, 5893, 16753, 13418, 13390, 1812, 76, 476, 320, 9865, 323, 2007, 2561, 2929, 310, 973, 3542, 10895, 9990, 285, 941, 4849, 4278, 403, 4518, 5393, 50276, 34263, 256, 5503, 16753, 13418, 285, 12544, 7274, 403, 6760, 4477, 1304, 3045, 762, 2710, 2801, 3302, 5837, 285, 2709, 896, 47473, 5277, 4217, 
49602, 323, 2007, 789, 50276, 77, 1807, 273, 1781, 941, 18406, 4558, 247, 14155, 2803, 323, 5893, 16753, 13418, 2561, 3340, 1677, 253, 6242, 273, 10341, 1142, 5893, 3417, 285, 253, 1781, 7286, 285, 5281, 7629, 323, 1027, 5074, 1236, 2510, 25912, 941, 18406, 476, 320, 9865, 323, 17690, 2007, 2561, 275, 253, 2170, 327, 253, 42644, 3540, 4477, 1304, 1566, 3045, 4583, 10848, 5893, 3417, 253, 9414, 812, 7826, 5649, 432, 247, 625, 32449, 1783, 4645, 3045, 591, 2060, 3417, 436, 476, 1421, 281, 247, 1805, 4685, 273, 849, 256, 5503, 3210, 476, 9232, 1027, 5893, 9050, 403, 690, 3417, 625, 11132, 685, 2571, 50276, 249, 253, 310, 3540, 4477, 7472, 253, 2087, 5837, 15277, 273, 271, 20589, 3024, 1237, 327, 39709, 5893, 3417, 253, 6454, 273, 436, 1783, 310, 2581, 27350, 247, 22296, 16753, 13418, 1566, 651, 452, 247, 1534, 3045, 6379, 327, 271, 39709, 1789, 7140, 310, 734, 28956, 2087, 5837, 3264, 407, 22296, 3210, 323, 1650, 253, 35185, 273, 2234, 10801, 1537, 1818, 323, 1027, 5074, 347, 4767, 275, 3104, 23541, 13851, 1014, 604, 247, 1566, 310, 2581, 5185, 275, 26475, 271, 39709, 7140, 352, 310, 1896, 326, 352, 476, 956, 247, 5777, 1027, 2234, 3659, 6661, 685, 253, 581, 13542, 2931, 275, 253, 3216, 33024, 26332, 12724, 26475, 253, 1072, 24705, 1127, 275, 5777, 1027, 1789, 4328, 4283, 281, 1781, 2228, 2193, 891, 717, 7514, 670, 253, 8453, 273, 436, 1783, 390, 697, 45984, 323, 253, 5301, 273, 1027, 3210, 247, 625, 4569, 1854, 323, 16344, 253, 2087, 5837, 273, 1027, 3210, 812, 320, 949, 247, 3700, 390, 1643, 5103, 4715, 15722, 2905, 40290, 369, 2908, 275, 6480, 323, 253, 13390, 740, 76, 10895, 50276, 783, 1750, 973, 32927, 12182, 2392, 1537, 320, 247, 1652, 21248, 285, 690, 625, 4278, 5001, 253, 2831, 47009, 2684, 812, 320, 27096, 50276, 284, 671, 5393, 407, 253, 4477, 436, 10895, 310, 1335, 4942, 1355, 2429, 281, 1966, 16753, 13418, 49602, 3782, 1677, 253, 1781, 1180, 273, 5074, 10848, 4283, 281, 247, 1643, 8014, 3888, 591, 5893, 3417, 5474, 33032, 2520, 2929, 23970, 247, 747, 10895, 13390, 1812, 76, 323, 5893, 16753, 13418, 285, 12544, 13390, 1812, 76, 8414, 273, 2164, 361, 3492, 29205, 342, 1884, 5893, 3417, 253, 4477, 873, 598, 1264, 11411, 42644, 310, 1049, 76, 1754, 327, 616, 5728, 10895, 253, 2929, 310, 973, 3542, 285, 3477, 281, 956, 253, 4477, 873, 598, 1264, 1027, 11411, 327, 436, 10895, 1690, 42644, 310, 285, 1049, 76, 11411, 4679, 1646, 22335, 3590, 50276, 2577, 760, 4468, 310, 253, 38135, 273, 436, 10895, 327, 5893, 16753, 13418, 6706, 66, 1479, 253, 16753, 13418, 275, 253, 5893, 18794, 10895, 3436, 8414, 273, 5922, 76, 13009, 3969, 281, 39739, 5893, 3417, 253, 5893, 18794, 10895, 556, 625, 5893, 3417, 342, 253, 1072, 2408, 273, 13009, 594, 253, 37317, 11121, 253, 5893, 18794, 10895, 310, 625, 11132, 685, 253, 4081, 10895, 327, 5893, 16753, 13418, 1580, 253, 4081, 10895, 4428, 581, 625, 5893, 16753, 12544, 4836, 891, 4510, 281, 1691, 253, 13716, 387, 42876, 1840, 14924, 7887, 50275, 187, 187, 4118, 18435, 27, 6438, 10182, 2278, 891, 2868, 326, 436, 2929, 310, 247, 4217, 7680, 281, 253, 1263, 273, 5893, 16753, 13418, 285, 12544, 253, 2929, 2959, 2762, 10123, 432, 512, 253, 30628, 285, 253, 4477, 452, 8379, 9713, 253, 7350, 5001, 253, 3480, 273, 4209, 38135, 285, 12497, 22791, 272, 253, 4477, 671, 2879, 3081, 1698, 5103, 4679, 281, 7568, 253, 31471, 273, 253, 10895, 1754, 327, 436, 891, 1158, 253, 2929, 16382, 253, 2534, 323, 253, 3540, 285, 943, 320, 7607 ]
[ attention_mask: a sequence of 1s, one per token of the input_ids above ]
[ labels: token-ID sequence for the preceding example, mirroring its input_ids ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: Summary: this paper introduces a library that implements one infectious disease model and three agents in a gym environment. The authors provide analysis for a COVID-19 lockdown scenario using their library.

Strengths: the paper addresses an important and timely topic; the paper is easy to read and follow; the authors have open-sourced their library, and it seems to be well documented.

Weaknesses: the main weakness of the paper is the strength of the contributions. The authors' main contribution is a gym environment for a specific epidemiological model. Unfortunately, the novelty is somewhat lacking: as the authors mention in the conclusion section, this line of work has been heavily researched, and there are many infectious disease models and simulation environments already out there (e.g., a gym-like interface has even been open-sourced earlier this year: https://github.com/google/ml-fairness-gym/blob/master/environments/infectious_disease.py). In terms of impact, the number of environments and agents is also quite limited at this stage, and it is unclear whether there is likely to be adoption by serious policy makers; absent such impact statements, it remains one of many simulation frameworks out there. What could make the paper better: (a) many more environments and agents need to be implemented, such that this library has the potential to become the standard for infectious disease simulation; (b) a lot more analysis comparing agents and disease scenarios that truly unearths interesting scientific observations; (c) real-world impact statements of adoption by policy makers and governments.

docsep

This paper introduces an OpenAI gym environment for RL optimization of epidemic containment policies. The environment currently contains an example SEIR model parameterized for COVID-19, along with a simple economic model to evaluate the lost productivity due to lockdowns. Some experimental results are shown where different deep learning algorithms are used to optimize intermittent lockdown policies. On the positive side, connecting the epidemiological and ML communities is definitely an important goal, and developing open-source tools to make this interaction easier is valuable. I am not sure that this is an appropriate paper for ICLR, though: it mostly takes a preexisting epidemiological model and exposes it in the OpenAI gym interface, without a strong research contribution. I believe that there are technical issues in the development of a platform for health policy optimization which are likely to result in research contributions, for example dealing with uncertainty in models/parameters, developing more efficient methods for multi-objective optimization, or providing explanations of policies (particularly since experts are unlikely to implement an RL policy verbatim, but will rather try to synthesize its recommendations with other considerations and sources of information). I hope that the authors continue to refine this platform and tackle these or other issues in the future.

docsep

The authors provide a Python tool able to model the development of epidemics as optimization problems. This allows easing the work of decision-makers when faced with the problem of deciding on new lockdowns. The model has been applied to real-world data to evaluate the consequences, in terms of deaths and per-capita loss, of a new lockdown. The paper presents an interesting study on the dynamics of the epidemic. In my opinion, the development of a tool is relevant for medical and decision-making studies, but is not novel or significant enough for the ML field. I think that this kind of analysis is better suited for a more applied venue, and the novelty provided in your work is not enough to justify a publication at ICLR. The paper is clear and well written; I appreciate the use of a real case study and the accompanying analysis. I think that the only problem is that the venue chosen by the authors does not fit its purpose. Questions: did you also try your model on different datasets? I think that a wider experimental campaign might improve the value of what has been proposed here. What about other forms of prevention besides lockdowns? For instance, it is possible to model tracing or testing as prevention methods in your modeling; this would make the model more flexible and allow decision-making bodies to act in a more flexible way. Another interesting study might include the use of different methods for contagion prevention, applied jointly or sequentially, in order to understand which policies over time might be the most promising for a health vs. GDP trade-off. After rebuttal: the paper seems interesting, but as I already mentioned and as other reviewers pointed out, the main concerns about this paper are novelty and relevance to the ML community. ### Summary:
The reviewers agree that the contributions may not be relevant to the ML research community, or perhaps are a poor fit for the venue, but otherwise find the work potentially useful and addressing a timely topic. Because the paper focuses on a simulation environment for existing epidemiological models, reviewers comment that the technical and methodological novelty is limited.
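The reviews above describe wrapping an SEIR epidemic model with lockdown actions and an economic penalty in an OpenAI-gym-style interface. As a rough illustration only, a minimal sketch of what such an environment could look like is below; the class name, parameter values, reward weighting, and dynamics are hypothetical and follow the classic gym reset/step API, not the API of the library under review.

```python
import numpy as np
import gym
from gym import spaces


class SEIRLockdownEnv(gym.Env):
    """Hypothetical sketch: discrete-time SEIR dynamics with a binary lockdown action.

    This is NOT the reviewed library's API; names and constants are illustrative.
    """

    def __init__(self, beta=0.35, sigma=0.2, gamma=0.1,
                 lockdown_beta_factor=0.4, econ_cost=0.2, horizon=365):
        self.beta, self.sigma, self.gamma = beta, sigma, gamma
        self.lockdown_beta_factor = lockdown_beta_factor  # transmission multiplier under lockdown
        self.econ_cost = econ_cost                        # per-step productivity loss of a lockdown
        self.horizon = horizon
        self.action_space = spaces.Discrete(2)            # 0 = open, 1 = lockdown
        self.observation_space = spaces.Box(0.0, 1.0, shape=(4,), dtype=np.float32)

    def reset(self):
        self.t = 0
        # State is (S, E, I, R) as population fractions.
        self.state = np.array([1.0 - 1e-4, 1e-4, 0.0, 0.0])
        return self.state.astype(np.float32)

    def step(self, action):
        s, e, i, r = self.state
        beta = self.beta * (self.lockdown_beta_factor if action == 1 else 1.0)
        new_exposed = beta * s * i
        new_infected = self.sigma * e
        new_recovered = self.gamma * i
        self.state = np.array([s - new_exposed,
                               e + new_exposed - new_infected,
                               i + new_infected - new_recovered,
                               r + new_recovered])
        self.t += 1
        # Reward trades off current infections against the economic cost of a lockdown.
        reward = -i - self.econ_cost * float(action)
        done = self.t >= self.horizon
        return self.state.astype(np.float32), reward, done, {}
```

An RL agent interacting with such an environment would learn an intermittent lockdown schedule; the reviewers' point is that the epidemiological modeling and policy-analysis questions, rather than the gym wrapper itself, are where a research contribution would need to lie.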
[ input_ids: token-ID encoding of the review example above ]
[ attention_mask: all 1s, same length as the input_ids above ]
[ labels: duplicate of the input_ids token sequence above ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: Paper contributions: the paper proposes SSW-GAN, an adversarial generative model of video, which proposes a new generator architecture along with splitting the training into multiple stages.

Strong points of the paper: the results are very strong. Prior work has focused on efficient decomposition of the discriminator; this work focuses on decomposition of the generator, effectively, and this is an extremely reasonable direction to take adversarial video research in. The claim that this requires substantially less computational cost is grounded.

Weak points of the paper: a major departure from prior work is training stage 1 of SSW-GAN separately from stage 2, but this is independent of the computational benefits provided by the generator architecture innovations, and an important ablation showing the results of training the stages jointly is missing. The generator architecture changes from prior work are quite concrete, but the high-level description doesn't clearly reflect that. Some of the comparisons and model descriptions are lacking details, which makes it difficult to understand the experiments.

Clearly state your recommendation (accept or reject) with one or two key reasons for this choice: I believe this paper is an accept (7). However, I think there are a number of places where the description of the architecture and experiments needs more detail, and for my final rating I would like to see these addressed in rebuttal. Furthermore, I think there is a key experiment missing; were that to be added, I would strongly consider moving to clear accept (8).

Supporting arguments for your recommendation: there are two key innovations in the SSW-GAN model: (a) a generator architecture which provides output to discriminators without requiring an entire high-resolution, full-length sample to be generated, because between stage 1 and stage 2 you select only a window of frames to run stage 2 on; (b) splitting the training into two phases, where at first you only train stage 1 and then you train stage 2. These are very interesting ideas, since either one substantially reduces the training cost of the video model, something aptly described in the paper and not something strongly touched on by previous work. In addition, the metric scores of the SSW-GAN model are very nice, state of the art in almost all cases. For that reason I think the work in this paper is worthy of acceptance. However, there are a couple of points of clarification and expanded description which I think are needed, and I would like to see the following points addressed in the rebuttal. The core idea of judging fixed-length output upscaling as a generative problem seems like a very clear and reasonable idea; most of the high-level description of the model omits this in favor of a general multi-stage definition of the SSW-GAN model, and I think the introduction and abstract would benefit from being clearer about this change. In section 4 it is unclear whether all four bold sections are trained independently or not; I think it would be good to make clear in the paper's structure which components are trained together. In general, the paper is not very clear about where upsampling, both in time and space, occurs; I think it would be good to describe this in the paper text and in the architecture figures. I am also not clear on the comparisons between SSW-GAN and DVD-GAN: in the comparison with prior work, when you say "our model trained to generate 128x128, 12-frame videos", is this the model which had a first stage trained on 32s3225 and then a second stage trained on input windows of 6 frames generating 128x128, 12 frames, from which you took single samples to compare? Similarly, when you say "however our model is only trained on 128x128, 12-frame outputs, as it is unrolled and applied convolutionally over the first-stage output to generate 48 frames", isn't it the case that the first stage is trained on longer sequences? In table 1, the numbers for DVD-GAN seem lifted from that paper, which I believe uses a Kinetics-600-trained I3D for metric calculation; this means the FID numbers are comparable, but in section 5, when describing IS, you say you are using a Kinetics-400-trained I3D as in FVD. Is this correct (in which case the numbers are not quite comparable), or is this just misexplained and all numbers in table 1 come from Kinetics-600-trained I3Ds? I think the IS/FID metrics you pick in table 1 are the better metrics, but could you also add FVD numbers? That would allow a comparison against TrIVD-GAN [1], which outperforms DVD-GAN; I think this is important because that paper also discusses modifications which reduce the memory requirement of DVD-GAN. Finally, there is a major question this paper does not address: is the two-stage training of SSW-GAN necessary, or can it be trained in a single pass but with the generator decomposition as described? Doing so would be simpler and might also reduce the need for the matching discriminator, which the paper contains an ablation for, but which I think needs a further ablation when the model is not trained in two stages. I believe a key experiment would be training an SSW-GAN architecture in a single pass; the results of that experiment would mean quite a lot for interpreting the changes described in this paper. In particular, the paper's title and abstract focus on the multi-stage aspect of the model but skim over the substantial change of making the generator architecture more modular and scalable; I think it would be very beneficial to understand how each of these changes independently affects the performance of the model. I do not think this ablation is required to maintain an accept rating, but including it would push my rating closer to clear accept and be quite a strong addition to the paper's content.

[1] https://arxiv.org/abs/2003.04035

docsep

Summary: the paper proposes a stage-wise training pipeline for generating 128x128-resolution videos of up to 100 frames. It starts by generating low-resolution, temporally downsampled videos and upsamples the results in a stage-wise manner. Experimental results on Kinetics-600 and BDD100K demonstrate that the network is effective in generating higher-resolution videos.

Strengths: the idea is easy to understand; the paper is well written and easy to follow; quantitative results show that the proposed method is superior to existing methods under some circumstances.

Weaknesses: 1. The novelty is very low. Stage-wise and progressive training have been proposed for a long time and are used everywhere; the way the authors use them doesn't really exhibit anything novel to me. 2. The resolution of the outputs (128x128) is lower than prior works (e.g., DVD-GAN has 256x256 outputs). Since the paper claims the computation cost is lower, one would expect the model to generate higher-resolution and much longer-duration videos, but in fact it is quite the opposite. To prove the effectiveness, I feel the authors need to show something higher than 256x256, say 512 or 1024 resolution. On the other hand, the hardware requirement is still high (128 GPUs) rather than normal equipment that everyone can have, so I really don't see any benefit of the model. If the authors could train DVD-GAN using only a handful of GPUs, that might also be a contribution, but that is not the case now. 3. Output quality is reasonable but still far from realistic. Recent GAN works have shown amazing quality in synthesized results, and the bar has become much higher than a few years ago; in that respect I feel there is still much room for improvement in the result quality. Overall, given the limited novelty, low-resolution output, and still-high hardware requirement, I am inclined to reject the paper.

docsep

The method shows promising results on generating long-duration (up to 100 frames) class-conditional videos with convincing Inception Scores, indicating quality similar to DVD-GAN while consuming less memory and with better coherence. While the contribution of the paper is mainly to improve the DVD-GAN architecture to reduce training time and memory consumption, the reviewer believes that the paper would be a good contribution to the venue. Below are some questions on the methodology. 1. Is there any way to tell whether the matching discriminator actually only estimates the ability to upsample \hat{x}_w from the previous low-resolution sample x_{wl}? From the architecture it is not evident whether it only does this, or whether it is also entangled with an assessment of how good the low-resolution sample x_{wl} was. In other words, if the low-resolution sample scores well (e.g., because it is real-world data) but the upsampling does not match, would the objective of the matching discriminator training still score it as a good upscaling, or is there any reason preventing this type of behaviour? 2. Although, as mentioned in the introduction, it may not be as big a problem as for VAE-based models, the problem of blurring might exist for DVD-GAN-like models. It is written in the caption of figure 5 that, despite the two stages of local upsampling, the frame quality does not degrade noticeably through time. Although the reviewer appreciates that previous work reported only IS/FID/FVD metrics, and that defining proper evaluation metrics for generative models is an open question, it might be a good idea to show some other quantitative metrics, such as power spectral density (PSD) plots similar to figure 5 from [1]; this would help give an idea of how the results compare to real-world video in terms of blurring. 3. Given that the generation of videos is class-conditional, is it possible to show the metrics per class? Are the scores per class similar, or does the method score better for larger classes or for classes with specific motion dynamics?

[1] Ayzel et al., 2020. RainNet v1.0: a convolutional neural network for radar-based precipitation nowcasting.

docsep

Pros: 1. A stage-wise approach to training GANs for video is defined to reduce the computational costs needed to generate long, high-resolution videos. 2. The authors provide some quality results for the proposed approach.

Cons: 1. The contribution of this paper is very limited; the authors make only an incremental improvement over current GAN models, and the theoretical analysis of the stage-wise training approach is not sufficient. 2. The experiments are not convincing; the authors only compared against the baseline methods. Besides, the proposed training strategy should be applied to different GAN-based generation models to show its effectiveness in different cases. 3. This paper aims to reduce the computation cost of model training but does not achieve a significant effect, as model training still takes 2-3 days.

docsep

The paper proposes a GAN-based model which generates videos in multiple stages. The main idea is the upsampling of the spatiotemporal resolution upon addition of a stage; this is the key feature of the proposed model, allowing it to generate videos of higher temporal resolution while using significantly less computational resources.

Strengths: the paper is clearly written; the model performs competitively with relevant baselines with respect to quantitative metrics; the evaluation of the model has been conducted on real-world datasets; implementation details have been mentioned clearly.

Weaknesses: there have been earlier attempts at multi-stage video generation [1, 2]; however, the paper misses citations in this direction. Also, apart from the conditioning of the generation, how is the proposed model different from the existing multi-stage ones? The generated samples for the Kinetics dataset are not temporally consistent and miss several details, especially for smaller entities in the video. To list a few: in figure 2, row 4, the baby's face looks distorted and different in every frame; in figure 3, row 2, and in figure 2, row 1, the face of the person is completely incomprehensible. The generated samples in the paper do not have a lot of perceived motion in them; how does the model perform when the input class is supposed to possess large temporal variations? Overall, the paper presents a scalable way to generate video with higher temporal resolution; however, the generated results do not look realistic, and a lot of important details are missing in the generated samples. Therefore my initial rating for this paper is 4.

References used in the review:
[1] Learning to generate time-lapse videos using multi-stage dynamic generative adversarial networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2018.
[2] Zhao, L., Peng, X., Tian, Y., Kapadia, M., and Metaxas, D.N. Towards image-to-video translation: a structure-aware approach via multi-stage generative adversarial networks. International Journal of Computer Vision, 2020.

Post-rebuttal comments: I appreciate the revisions and additional results presented by the authors. The authors have addressed my concerns as well as improved the clarity of the model description in the revised version of the paper. While the results are not perfect, I acknowledge that the problem of video generation is difficult, and I believe such a multi-stage model can motivate future methods in this direction of scalable video generation. Therefore I would like to raise my score to 6 and would recommend acceptance of this paper. ### Summary:
This paper proposes a GAN for video generation based on stage-wise training over different resolutions, addressing scalability issues with previous approaches. Reviewers noted that the paper is clearly written, proposes a method that improves upon the DVD-GAN architecture by reducing training time and memory consumption, and has competitive quantitative results. On the other hand, the more negative reviewers are concerned that the empirical improvements demonstrated are somewhat incremental and that there is not much novelty, as the proposed approach is similar to other methods that decompose the generation process into multiple stages at different temporal window lengths and/or spatial resolutions. The authors argue that these criticisms are subjective and non-actionable; I sympathize with their frustration, but an acceptance decision for a competitive conference like ICLR does involve some subjective judgment as to whether the method and/or results meet a high bar beyond mere correctness. For this submission that is a close call, but between the novelty/incrementality concerns and the other, more minor issues raised by reviewers (e.g., missing frame-conditional evaluation), I believe this paper could benefit from another round of revisions and improvements, and I recommend rejection. I hope the authors will consider improving the submission based on the reviewers' feedback and resubmitting to a future venue, as the paper certainly has merit. To this end, I have a few concrete recommendations for the authors which could have flipped my recommendation to an accept if implemented. Report results in the frame-conditional setting for comparison with DVD-GAN and other methods that operate in this setting. Proofread the paper more thoroughly: I noticed several typos while skimming the paper; e.g., in the theory section, the second term of Eq. 6 confusingly uses rho instead of log. Relatedly, given that Appendix B1 reports that the hinge loss is used, I am not sure whether log is correct in the first place; this probably deserves further explanation or correction. Demonstrate or argue more convincingly, in one way or another, that SSW-GAN's improved efficiency really expands the frontier of what was possible before. It is true that the 128x128, 100-frame video samples contain 2x as many total pixels as DVD-GAN's 256x256, 12-frame samples, but this isn't a strict improvement, as the spatial resolution is smaller, and a 2x difference leaves space for reviewers to reasonably wonder whether previous methods really couldn't have matched this if pushed. Some possible examples: show that SSW-GAN can generate longer 256x256 videos (a strict improvement over what was possible with DVD-GAN), or orders-of-magnitude longer (e.g., 1 minute) but still temporally coherent videos at 128x128, or videos with substantially improved subjective sample quality at the same or higher resolution. The paper notes that DVD-GAN models do not unroll well and tend to produce samples that become motionless past their training horizon; if this were quantified, e.g., by additionally reporting IS/FID/FVD separately for different timestep ranges, it could make a more compelling argument in favor of SSW-GAN.
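The meta-review above questions whether the log in the second term of Eq. 6 is consistent with the hinge loss reported in Appendix B1. For reference only, a sketch of the standard hinge objectives commonly used in video GANs such as DVD-GAN is shown below; this is the generic formulation, not necessarily the exact loss used by SSW-GAN, and the function names are illustrative.

```python
import torch
import torch.nn.functional as F


def discriminator_hinge_loss(d_real: torch.Tensor, d_fake: torch.Tensor) -> torch.Tensor:
    """Hinge loss for the discriminator:
    mean(max(0, 1 - D(x_real))) + mean(max(0, 1 + D(G(z))))."""
    return F.relu(1.0 - d_real).mean() + F.relu(1.0 + d_fake).mean()


def generator_hinge_loss(d_fake: torch.Tensor) -> torch.Tensor:
    """Hinge loss for the generator: -mean(D(G(z))); no log term appears here."""
    return -d_fake.mean()
```

If Eq. 6 of the paper indeed contains a log (or rho) term, it would correspond to the original saturating/non-saturating GAN objective rather than this hinge form, which is presumably the inconsistency the meta-review is pointing at.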
[ input_ids: token-ID encoding covering part of the example above; the excerpt ends partway through this sequence ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 533, 275, 247, 2014, 1509, 253, 1543, 273, 326, 3368, 651, 1599, 3240, 247, 2257, 323, 29375, 253, 2544, 2529, 275, 436, 2929, 275, 1798, 253, 9380, 4060, 285, 12002, 2770, 327, 253, 1554, 382, 486, 4809, 273, 253, 1566, 533, 43816, 689, 253, 6832, 1818, 273, 2403, 253, 14156, 10336, 625, 23178, 285, 44755, 891, 1158, 352, 651, 320, 1077, 12912, 281, 2096, 849, 1016, 273, 841, 2544, 10939, 1055, 253, 3045, 273, 253, 1566, 50276, 74, 513, 417, 1158, 436, 28913, 310, 2424, 281, 6558, 271, 2997, 13716, 533, 1690, 352, 651, 7450, 619, 13716, 8003, 281, 2590, 2997, 285, 320, 3240, 247, 2266, 1635, 281, 253, 9380, 2600, 50275, 18, 5987, 39962, 2061, 5375, 1518, 1229, 1449, 1671, 5474, 339, 793, 360, 3454, 253, 2929, 29328, 247, 3924, 3020, 3733, 15722, 323, 3733, 12842, 89, 8196, 6064, 10556, 273, 598, 281, 2233, 13009, 352, 7866, 407, 11365, 1698, 6064, 285, 5897, 595, 1066, 22163, 6216, 10556, 285, 598, 16848, 253, 1543, 275, 247, 3924, 3020, 5133, 5661, 1543, 327, 24273, 10487, 285, 270, 1678, 2313, 76, 7568, 326, 253, 2990, 310, 3576, 275, 11365, 2169, 6064, 10556, 50276, 296, 3755, 20556, 253, 2934, 310, 3477, 281, 2096, 253, 2929, 310, 973, 3542, 285, 3477, 281, 956, 11745, 1543, 921, 326, 253, 4081, 1332, 310, 8936, 685, 5368, 3082, 762, 690, 5989, 50276, 20881, 1255, 265, 337, 186, 783, 38135, 310, 1077, 1698, 3924, 3020, 285, 13439, 3733, 452, 644, 4081, 323, 824, 247, 1048, 673, 597, 452, 644, 908, 11678, 253, 1039, 253, 4477, 897, 731, 13414, 1663, 10738, 2712, 4460, 281, 479, 374, 186, 783, 6064, 273, 253, 18012, 12842, 89, 8196, 310, 2406, 685, 2720, 2987, 24088, 277, 19122, 1247, 556, 17558, 89, 9726, 18012, 1580, 253, 2929, 3916, 253, 13782, 2105, 310, 2406, 581, 651, 1902, 253, 1566, 476, 6635, 2169, 6064, 285, 1199, 3356, 7467, 10556, 533, 275, 958, 697, 3240, 253, 7285, 281, 5276, 253, 12510, 891, 1928, 253, 4477, 878, 281, 921, 1633, 2169, 685, 17558, 89, 9726, 1333, 23414, 390, 27277, 6064, 327, 253, 643, 1133, 253, 10309, 8284, 310, 1335, 1029, 12842, 31025, 316, 3185, 273, 690, 2622, 6500, 326, 4130, 476, 452, 594, 891, 1663, 13414, 923, 667, 5649, 273, 253, 1566, 604, 253, 4477, 476, 6194, 277, 19122, 1247, 970, 760, 247, 17167, 273, 31025, 316, 326, 1537, 671, 320, 247, 7680, 533, 697, 417, 253, 1083, 1024, 495, 186, 9252, 3290, 310, 5272, 533, 1335, 2080, 432, 15958, 3332, 36827, 2987, 452, 2011, 8644, 3290, 275, 17791, 1543, 285, 253, 2534, 556, 2489, 1199, 2169, 685, 247, 1643, 1107, 3622, 275, 326, 4809, 891, 1928, 253, 373, 1335, 1199, 2316, 323, 7756, 323, 253, 906, 3290, 50276, 1189, 455, 1677, 253, 3710, 38135, 1698, 6064, 3453, 285, 1335, 1029, 10309, 8284, 516, 21802, 281, 12009, 253, 2929, 5474, 339, 431, 248, 1332, 2722, 12532, 1543, 327, 327, 11365, 1029, 7467, 598, 281, 2233, 13009, 966, 17697, 10556, 342, 21414, 39645, 7363, 7809, 3290, 2074, 281, 277, 19122, 1247, 1223, 21337, 1679, 3541, 285, 342, 1805, 25253, 50276, 6050, 253, 7680, 273, 253, 2929, 310, 7194, 281, 3157, 253, 277, 19122, 1247, 10336, 281, 4796, 3733, 673, 285, 3541, 8353, 253, 37317, 11532, 326, 253, 2929, 651, 320, 247, 1175, 7680, 281, 253, 18767, 2708, 627, 403, 690, 3533, 327, 253, 16182, 337, 310, 627, 667, 1039, 281, 2028, 1057, 253, 11038, 7134, 12915, 2686, 760, 8197, 253, 3745, 281, 598, 16848, 7856, 89, 88, 432, 253, 2045, 1698, 6064, 3410, 1269, 24966, 432, 253, 10336, 352, 310, 417, 8943, 1880, 390, 417, 352, 760, 1057, 436, 390, 352, 310, 671, 36255, 342, 6803, 273, 849, 1175, 253, 1698, 21061, 3410, 1269, 24966, 50276, 4238, 275, 643, 3000, 604, 253, 1698, 6064, 3410, 
7363, 1175, 24088, 984, 697, 253, 1524, 10186, 941, 533, 253, 598, 48027, 1057, 417, 3761, 50276, 12756, 253, 8103, 273, 253, 11038, 30047, 3733, 1335, 4868, 352, 347, 247, 1175, 598, 49708, 50276, 263, 310, 627, 667, 1921, 13538, 432, 436, 1511, 273, 8770, 50276, 19, 3738, 347, 5393, 275, 253, 10199, 352, 778, 417, 320, 347, 1943, 1895, 347, 323, 13460, 2275, 833, 3210, 253, 1895, 273, 29017, 804, 1537, 2226, 323, 277, 19122, 1247, 3022, 3210, 352, 310, 3542, 275, 253, 11743, 273, 4677, 608, 326, 5747, 253, 767, 8661, 273, 1980, 598, 48027, 253, 3665, 3290, 1057, 417, 40195, 4366, 1598, 949, 673, 3738, 253, 37317, 6373, 28032, 326, 2045, 789, 2361, 760, 310, 71, 301, 71, 19122, 17082, 285, 326, 13947, 1463, 7103, 17082, 323, 1006, 800, 3210, 310, 271, 1527, 1953, 352, 1537, 320, 247, 1175, 2934, 281, 921, 690, 643, 11745, 17082, 824, 347, 1612, 9879, 4038, 3714, 69, 14777, 2074, 281, 4677, 608, 432, 337, 436, 651, 1361, 755, 271, 2934, 849, 352, 26662, 281, 253, 1524, 10186, 3492, 275, 2426, 273, 29017, 804, 273, 253, 1543, 50276, 20, 1677, 326, 253, 5978, 273, 10556, 310, 966, 35428, 310, 352, 1896, 281, 921, 253, 17082, 591, 966, 403, 253, 7363, 591, 966, 2074, 390, 1057, 253, 1332, 4868, 1805, 323, 4067, 5971, 390, 253, 5971, 342, 2173, 3200, 8062, 50275, 18, 26275, 21608, 1162, 355, 9169, 1218, 2966, 292, 362, 740, 247, 27311, 267, 11454, 2990, 323, 22013, 3169, 26611, 1024, 29851, 7152, 339, 377, 2921, 337, 247, 3924, 3020, 2746, 281, 6194, 305, 507, 323, 3492, 310, 2931, 281, 4796, 253, 15180, 4815, 3058, 281, 6635, 1048, 1029, 6064, 10556, 374, 253, 4477, 2085, 690, 3290, 1543, 273, 253, 4081, 2746, 50276, 5040, 337, 253, 7680, 273, 436, 2929, 310, 1077, 3710, 253, 4477, 816, 513, 690, 32809, 7756, 1754, 327, 1655, 36827, 3210, 285, 253, 10527, 1783, 323, 253, 3924, 3020, 3733, 2746, 310, 417, 2217, 50276, 19, 253, 4679, 403, 417, 21414, 253, 4477, 760, 2429, 253, 8245, 3082, 275, 253, 4679, 16280, 253, 4081, 3733, 5700, 943, 320, 3732, 275, 1027, 5978, 3210, 1754, 327, 36827, 281, 921, 253, 12510, 275, 1027, 2219, 50276, 20, 436, 2929, 13698, 281, 4796, 253, 13782, 2105, 273, 253, 1566, 3733, 533, 513, 417, 5115, 1534, 1055, 534, 3936, 3495, 1897, 323, 1566, 3733, 5474, 339, 431, 248, 2929, 29328, 247, 36827, 3169, 1566, 534, 15693, 10556, 275, 2709, 8661, 253, 2022, 2934, 310, 253, 598, 48027, 273, 253, 7046, 7173, 358, 23702, 6064, 2220, 1635, 273, 247, 3924, 436, 310, 253, 2234, 4735, 273, 253, 4081, 1566, 6941, 253, 1566, 281, 6635, 10556, 273, 2169, 11935, 6064, 1223, 970, 3012, 1679, 15180, 5300, 50275, 296, 3755, 20556, 50276, 783, 2929, 310, 4518, 3542, 50276, 783, 1566, 17923, 3947, 25785, 342, 4623, 1666, 25379, 342, 1675, 281, 11745, 17082, 50276, 783, 7103, 273, 253, 1566, 556, 644, 5196, 327, 1524, 1533, 15302, 50276, 39595, 4278, 452, 644, 5393, 4518, 50276, 20881, 1255, 265, 50276, 9088, 452, 644, 4321, 9437, 323, 1554, 382, 486, 3492, 5978, 50276, 805, 2299, 253, 2929, 38771, 30404, 275, 436, 3884, 671, 7419, 432, 1617, 323, 253, 5978, 849, 310, 253, 4081, 1566, 1027, 432, 253, 5368, 1554, 382, 486, 4394, 50276, 783, 4561, 3530, 323, 24273, 10895, 403, 417, 5897, 595, 5185, 285, 38771, 2067, 4278, 3340, 323, 4577, 14429, 275, 3492, 281, 1618, 247, 1643, 275, 4677, 374, 4194, 577, 253, 5366, 656, 2454, 4453, 32408, 285, 1027, 275, 1046, 3665, 275, 4677, 495, 4194, 374, 285, 275, 4677, 374, 4194, 337, 253, 2454, 273, 253, 1436, 310, 4336, 15321, 6792, 6286, 50276, 783, 4561, 3530, 275, 253, 2929, 513, 417, 452, 247, 2257, 273, 12351, 3200, 275, 731, 849, 1057, 253, 
1566, 1347, 672, 253, 3280, 966, 310, 6326, 281, 7081, 5699, 11935, 10575, 50276, 1189, 455, 253, 2929, 10262, 247, 44755, 1039, 281, 6635, 3492, 342, 2169, 11935, 6064, 2299, 253, 4561, 1543, 513, 417, 1007, 15958, 285, 2257, 273, 1774, 4278, 403, 5816, 275, 253, 4561, 3530, 3103, 619, 3302, 13716, 323, 436, 2929, 310, 577, 50274, 250, 3065, 908, 275, 253, 2278, 50276, 18, 4715, 281, 6635, 4522, 293, 8023, 10556, 970, 1554, 382, 486, 7870, 1006, 800, 48960, 6928, 275, 10061, 273, 253, 26332, 1796, 8059, 327, 4382, 8113, 285, 3102, 8981, 4765, 50276, 19, 1182, 31035, 298, 42151, 1269, 246, 757, 340, 42844, 324, 571, 278, 285, 1313, 991, 284, 277, 79, 4404, 4440, 292, 729, 2842, 10234, 247, 2605, 13823, 2746, 3066, 1554, 382, 486, 1006, 800, 48960, 6928, 5213, 6698, 273, 4382, 8113, 9169, 50274, 5996, 250, 2858, 22559, 5701, 50276, 74, 11435, 253, 38549, 285, 3081, 1543, 3559, 407, 253, 4477, 253, 4477, 452, 9713, 619, 7350, 347, 973, 347, 5520, 253, 19843, 273, 253, 1566, 5740, 275, 253, 17265, 2715, 273, 253, 2929, 1223, 326, 1543, 403, 417, 3962, 891, 14409, 326, 253, 1895, 273, 3492, 5978, 310, 2834, 285, 891, 2868, 824, 1554, 382, 486, 1566, 476, 41509, 2852, 3082, 275, 436, 3884, 273, 44755, 3492, 5978, 3103, 891, 651, 751, 281, 3157, 619, 4868, 281, 721, 285, 651, 5583, 14924, 273, 436, 2929, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 36827, 323, 3492, 5978, 1754, 327, 3924, 3020, 3733, 689, 1027, 30285, 15974, 9171, 1430, 3374, 342, 2045, 7274, 30628, 4879, 326, 253, 2929, 310, 4518, 3542, 29328, 247, 1332, 326, 19132, 2220, 253, 277, 19122, 1247, 10336, 407, 8493, 3733, 673, 285, 3541, 8353, 285, 556, 12085, 11745, 1543, 50276, 251, 253, 643, 1133, 253, 625, 4016, 30628, 403, 7514, 326, 253, 16774, 11701, 5183, 403, 8489, 32809, 285, 326, 627, 310, 417, 1199, 38135, 347, 253, 4081, 2746, 310, 2074, 281, 643, 3082, 326, 11101, 3014, 253, 5978, 1232, 715, 2709, 8661, 387, 1027, 11935, 3497, 16095, 285, 263, 8820, 30285, 253, 4477, 9059, 326, 841, 43680, 403, 17854, 285, 1327, 1913, 494, 891, 34144, 907, 342, 616, 22014, 533, 271, 14924, 3061, 323, 247, 12085, 8059, 751, 17857, 32888, 1057, 6388, 690, 17854, 3883, 347, 281, 1880, 253, 1332, 285, 263, 1543, 2525, 247, 1029, 2534, 4457, 11019, 36594, 323, 436, 19529, 28763, 247, 2810, 1067, 533, 875, 253, 38135, 19687, 420, 1319, 7350, 285, 253, 643, 625, 5884, 3374, 5439, 407, 30628, 24088, 5816, 3665, 35428, 7103, 891, 2868, 436, 2929, 812, 5649, 432, 1529, 3790, 273, 38549, 285, 11701, 285, 5583, 18235, 50276, 74, 3524, 253, 4477, 588, 1908, 11138, 253, 19529, 1754, 327, 253, 30628, 8680, 285, 501, 538, 15318, 281, 247, 2852, 18767, 347, 253, 2929, 5604, 556, 15785, 281, 436, 990, 891, 452, 247, 1643, 11859, 12645, 323, 253, 4477, 534, 812, 452, 34572, 619, 17401, 281, 271, 2997, 604, 9009, 50275, 16223, 1543, 275, 253, 3665, 35428, 4758, 323, 5301, 342, 277, 19122, 1247, 285, 643, 3082, 326, 10196, 275, 436, 4758, 50276, 16314, 1088, 253, 2929, 625, 16575, 891, 8344, 2067, 963, 993, 1223, 1629, 34282, 253, 2929, 24088, 275, 253, 3762, 2593, 253, 1273, 1307, 273, 16186, 721, 21643, 314, 4648, 391, 1689, 3185, 273, 2412, 2905, 314, 1677, 326, 30762, 270, 18, 5012, 326, 253, 38864, 2957, 310, 908, 516, 417, 2119, 1880, 2412, 310, 3451, 275, 253, 806, 1659, 50276, 2520, 3164, 22828, 2007, 8813, 390, 10618, 50276, 48387, 366, 1662, 489, 625, 2410, 1763, 5356, 275, 581, 1039, 390, 1529, 326, 256, 2140, 72, 507, 5520, 6733, 1663, 35205, 253, 34642, 273, 752, 369, 1896, 1078, 352, 310, 2032, 326, 253, 12842, 89, 8196, 2313, 3492, 
3530, 3831, 374, 89, 347, 1142, 2264, 15115, 347, 277, 19122, 72, 507, 17558, 89, 9726, 805, 3530, 533, 436, 310, 2649, 247, 7654, 7756, 347, 253, 8820, 6064, 310, 4577, 285, 247, 374, 89, 3064, 6505, 2317, 323, 30628, 281, 12054, 4282, 1880, 2045, 3082, 1663, 812, 2649, 452, 13373, 436, 604, 10184, 690, 1896, 6667, 273, 436, 921, 326, 256, 2140, 1247, 476, 6635, 3356, 17558, 89, 9726, 10556, 247, 7654, 7756, 689, 752, 369, 1896, 342, 277, 19122, 1247, 390, 7367, 273, 9777, 3356, 24088, 337, 7017, 533, 1335, 5897, 595, 18893, 10556, 387, 12842, 89, 8196, 390, 10556, 342, 9619, 5520, 17854, 3410, 3290, 387, 253, 1072, 390, 2169, 6064, 50276, 783, 2929, 7211, 326, 277, 19122, 1247, 3210, 513, 417, 440, 1811, 973, 285, 5257, 281, 4711, 3530, 326, 2489, 3200, 1417, 2469, 697, 3733, 16892, 604, 436, 497, 18755, 24088, 407, 23000, 9610, 310, 71, 301, 71, 19122, 11794, 323, 1027, 4522, 383, 554, 13794, 352, 812, 1056, 247, 625, 18511, 4154, 275, 3718, 273, 256, 2140, 1247 ]
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: in their submission the authors discuss an alternative learning rule to backpropagation called maximal coding rate reduction that was introduced recently in yu et al 2020 as far as i could tell as a nonexpert maximizing the coding rate reduction objective encourages inputs from different classes to be maximally incoherent with respect to one another spanning the largest possible volume while inputs from the same class should be highly correlated in the present submission the authors show that gradient ascent on the coding rate reduction naturally takes the form of a neural network with a particular residual network architecture that they term redunet the authors argue that sending an input through the layers of redunet naturally leads to representations that approximately maximize the coding rate the authors continue to show that when the representation is required to be groupequivariant with respect to shifts a convolutional structure naturally emerges this allows the authors to compute the update using layerwise fourier transformations finally the authors show that on several example tasks redunet leads to outputs that have a large coding rate with no parameters learned via backpropagation there were several aspects of this submission that i liked a lot i think the construction seems interesting and the rate reduction metric seems like a reasonable thing to optimize i found the relationship of coding rate maximization to redunet to be quite clever and the experiments seemed to suggest that the approximations employed in the paper eg the estimated membership in eqn 9 did not spoil the mapping having said this i dont know that i agree with the authors interpretation of their results and i believe they might be somewhat overstated of course i am happy to be corrected if i am incorrect in my assessment 1 my primary concern is the claim that the maximum rate reduction principle gives firstprinciples insight into convolutional networks the authors show that a particular convolutional architecture namely redunet approximates projected gradient ascent on the mcr2 metric however it is not at all clear to me that standard convolutional networks or residual network architectures are welldescribed by redunet in fact it seems that the very particular structure of redunet has a number of features that are at odds with standard convolutional networks nonlinearity only enters into redunet via the approximate membership which enters via a softmaxlike function and outputs of each layer are projected onto the (n-1)-sphere 2 since the relationship between redunet and commonplace cnns seems tenuous to me it seems there is a significant burden on the authors to probe this question empirically for example do cnns trained via backpropagation lead to architectures that resemble redunet unfortunately as far as i could tell the authors did not include experiments of this type 3 if the authors are proposing redunet as an alternative to modern cnns it would be nice to see how redunet performs on some common tasks with wellestablished baselines possibly with some finetuning step however as far as i could tell most of the experiments focused on verifying the properties of redunet note that i do not think one needs to achieve stateoftheart performance here but it would be nice to see how redunet fares since this will probably affect its impact a few more minor points 1 i may be missing something but given the
construction of redunet i feel as though the emergence of a convolutional structure subject to translation invariance is not terribly surprising indeed i feel like this result has high overlap with cohen and welling 1 this is not necessarily a criticism of the result per se but i think the phrasing could do a bit more to note this connection 2 it was not obvious to me why z had to be constrained to s^{n-1} can the authors provide some intuition here 3 the authors note that nonlinearity enters the network via the estimated membership however as far as i could tell the estimated membership is only used for test points does this imply that the network is linear when the membership is known 1 cohen t and welling m group equivariant convolutional networks jmlr 2016 update after discussion with the authors i am inclined to lower my score while i find the architecture proposed by the authors to be interesting i do not think they have done enough to motivate the connection with neural networks i find this especially troubling since the language used by the authors continues to imply that the connection is obvious i would encourage the authors to look into the literature on scattering networks eg httpsarxivorgabs12031513 for another approach to explaining networks from first principles that i think does a better job of making the connection to realistic neural network architectures docsep summary this paper proposes a theoretical understanding of neural architecture using the principle of rate reduction yu et al 2020 and the derived optimization steps naturally lead to operations such as network layers and identity residual by enforcing shift invariance it can also lead to convolutional operation the network can be constructed in a forward propagation fashion which conserves good discriminative ability novelty theoretical guidance of network design is one of the key directions in representation learning the paper proposes a novel perspective rate reduction in network construction that generalizes to the design of networks such as resnet and resnext however there are a bunch of other networks such as densenet nonlocal network etc which are also performing well in practice and designed with heuristics of larger context and better gradient propagation will the author also include these representations within the objective of rate reduction or does the generation process from rate reduction compact discriminative representation come with additional guidance of what network or answer the question in the introduction what objective is optimal for network design which could be shown to be effective on multiple tasks i think the major issue of the current result is that we can see the objective explain that the set up of networks resnet seeks a compact representation but the author has not shown strong experiments that their yielded alternatives by optimizing the objective rate reduction has positive relation with the network generalization does higher rate reduction lead to networks with better performance on multiple tasks last it is obvious that shift invariance leads to convolution which is effective for classification of 2d objects however in a realistic scenario we may seek equivariant disentangled representations rather than invariance is it possible for the objective to lead to convolution without explicitly inducing shift invariance writing and reference the writing is good and easy to follow checked a few derivatives which are correct i think the paper could also be related to optimization inspired network design eg
optimization algorithm inspired deep neural network structure design which delivers another perspective in general i think the objective is novel but not generalized enough to explain lots of high performance networks yet however more exploration of theoretical study should be encouraged docsepthe paper proposed a network derived from maximizing of rate reduction known as the mcr2 principle where all parameters are explicitly constructed layer by layer in a forward propagation fashion compared with exiting work by yu et al 2020 the proposed work is more like a white box that each layer is more interpretable results showed the proposed network can learn a good discriminative deep representation without any back propagation training the derivation of the network also suggests that the network is more efficient to learn and construct in the spectral domain overall the proposed work looks reasonable the paper is wellstructured the derivation and experiments seem convincing unfortunately the proposed work is out of the reviewers expertise and therefore it is hard to provide valuable commentsdocsepthe authors propose a deep network approach using the principle of rate reduction as a loss under a gradient ascent approach avoiding traditional backpropagation besides the work attempts to interpret the proposed framework from both geometrical and statistical views then shiftinvariant properties are discussed the innovative method allows the inclusion of a new layer structure named redunet which could benefit the deep learning community though the experiments are not challenging concerning the studied databases the authors aim to probe the concept without a complete implementation tuning overall the paper is illustrative enough regarding the mathematical foundationdocsepthe paper formulates an iterative process of deriving encoding of data into feature space as a deep model called redunet where each layer corresponds to iteration of the optimisation process the feature space according to the mcr2 principle the mcr2 optimisation maps points of different classes into separate subspaces with volume of each subspace being minimized while the volume of the entire space is maximized it is analogous to pushing like things together and unlike things apart the novelty of the paper is in that formulation of the feature optimisation is bakedin into a deep architecture mcr2 principle seems like a sensible approach to learning especially given that embedding algorithms such as face encoding use it already its nice to see some rigorous mathematical treatment on this however i get confused pretty early on by the notations if fxthetain mathcalrk and zfxin mathcalrnthen since yhz then fxthetahfxand so fxdelta and fx are two different functions yet later in the text zfxtheta and from then on including equation 11 fxdeltapsilzletagzl1thetal1which would make it seem fxthetain mathcarn and what is gzltheta1 equation 8 tells us what gzltheta1 must approximatebut what is it exactly is that a neural network or some model with parameters thetal or is equation 8 a definition of gzlthetalin which case what is thetal i dont think the math is necessarily wrongjust notation is confusing and definitions changingnot consistent i have also questions about equation 11 where number of layers is equivalent to iterations while maximizing mcr2 and the width of each layer corresponds to m the training points in the dataset so in order to do a mapping of an input x we need to perform l iterative steps using the entire m points every time 
isnt that equivalent to doing a massive learning process using the entire dataset for each mapping how computationally costly is that i also dont quite understand how psilz1theata1 works how does thetal change over iterations experimental section is not helping me with this since its stated that e cj are computed for each layerbut there are no details on what thetal is and how gz1theta1 is evaluated and if fxthetazlthen how do we get classification from that is it just based on definition of hatpijzl from page 4 finally i am not sure if the result of obtaining a convnet architecture in redunet when translation invariance constraint is added the embedding is all that surprising isnt it somewhat obvious that if each layer of redunet is invariant in some way then the entire network is invariant it feels like that what we are learning here is not that in order to have translationinvariant mapping we must have a conventbut rather that we can obtain a translation invariant deep architecture with translation invariant layers ### Summary:
this paper received borderline scores which makes for a difficult recommendation unfortunately two of the reviews were too short and thus were of limited use in forming a recommendation that includes the highscoring one which did not adequately substantiate its score there is much to admire in this submission reviewers appreciated the originality of this research linking rate reduction optimization to deep network architectures r1 the paper proposes a novel perspective r4 the novelty of the paper is in that formulation of the feature optimisation is bakedin into a deep architecture r5 i think the construction seems interesting and the rate reduction metric seems like a reasonable thing to optimize i found the relationship of coding rate maximization to redunet to be quite clever r3 short the innovative method allows the inclusion of a new layer structure named redunet reviewers also applauded the papers clarity including r4 who raised their score to 6 based on satisfying clarity revisions from the authors r1 the writing is good and easy to follow r4 postdiscussion clarity is not an issues anymore additional explanations provided by the authors and one more careful reading of the paper helped in understanding of all the aspects of the model r2 short the paper is wellstructured however there were some core questions around how well the main significance claims of the paper are supported the most indepth discussion on these topics is in the detailed thread with r5 in that thread there are many points discussed but the two issues seem to be 1 whether the connection between redunet and standard neural net architectures is sufficiently substantiated so as to constitute an explanation for behaviors of those standard architectures like cnns and 2 whether the emergence of redunets group invarianceequivariance is surprising or qualitatively new the first is much more central on the first issue r5 writes in summary fundamentally i think the authors propose a hypothesis that redunets explain dl models however the authors do not take meaningful steps towards validating this hypothesis i would contrast this with for example the scattering networks paper httpsarxivorgabs12031513 which did an exceptional job of arguing for an ab initio explanation of convolutional networks i find r5s perspective on this point to be compelling in that the paper currently doesnt do enough to justify these main claims either through drawing precise nontrivial mathematical connections or through experimental validation the thread has a much more detailed and nuanced discussion the second issue is not quite as central to the significance of the paper but it was noted by multiple reviewers r5 i may be missing something but given the construction of redunet i feel as though the emergence of a convolutional structure subject to translation invariance is not terribly surprising r4 finally i am not sure if the result of obtaining a convnet architecture in redunet when translation invariance constraint is added the embedding is all that surprising r4 postdiscussion reading the exchange between the authors and r5 i am still not fully convinced that translation invariance property is all that surprising but for me thats not a reason to reject at the least the paper as written hasnt yet convinced some readers myself included on these claims as i mentioned at the start this paper is borderline but because i am largely aligned with r5s perspectives i think this paper does not quite pass the bar for acceptance i recommend a rejection but 
i look forward to seeing a strengthened version of this work in the future i hope the feedback here has been useful to bringing about that stronger version
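For readers who want the objective that these reviews keep referring to, here is a minimal sketch of the maximal coding rate reduction (MCR^2) criterion, following Yu et al. 2020. The symbols are assumptions taken from that paper rather than from the submission under review: Z is the d x m matrix of m features, Pi^j is the diagonal membership matrix of class j, epsilon is the allowed distortion, and eta is a step size.

$$
R(Z;\epsilon) \;=\; \tfrac{1}{2}\log\det\!\Big(I + \tfrac{d}{m\epsilon^{2}}\,ZZ^{\top}\Big),
\qquad
R_{c}(Z;\epsilon \mid \Pi) \;=\; \sum_{j=1}^{k}\frac{\mathrm{tr}(\Pi^{j})}{2m}\,\log\det\!\Big(I + \tfrac{d}{\mathrm{tr}(\Pi^{j})\,\epsilon^{2}}\,Z\Pi^{j}Z^{\top}\Big),
$$

$$
\Delta R(Z;\Pi,\epsilon) \;=\; R(Z;\epsilon) - R_{c}(Z;\epsilon \mid \Pi),
\qquad
z^{\,l+1} \;\propto\; \mathcal{P}_{\mathbb{S}^{n-1}}\!\Big(z^{\,l} + \eta\,\tfrac{\partial \Delta R}{\partial z}\Big|_{z^{\,l}}\Big).
$$

If this reading is right, each redunet layer implements one projected gradient-ascent step on Delta R, with the feature renormalized to the unit sphere, which is the construction the reviewers question when they ask how closely it matches standard backpropagation-trained cnns.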
[ 5955, 342, 253, 4477, 891, 717, 21802, 281, 2406, 619, 4868, 1223, 891, 1089, 253, 10336, 4081, 407, 253, 4477, 281, 320, 4722, 891, 513, 417, 1158, 597, 452, 2218, 2217, 281, 41509, 253, 4602, 342, 11454, 6928, 891, 1089, 436, 3340, 39728, 1580, 253, 3448, 908, 407, 253, 4477, 7788, 281, 16084, 326, 253, 4602, 310, 4755, 891, 651, 11907, 253, 4477, 281, 1007, 715, 253, 6239, 327, 11715, 6928, 24088, 5987, 39962, 2061, 5375, 805, 2941, 1010, 1012, 323, 1529, 2746, 281, 15571, 6928, 432, 806, 9241, 326, 891, 1158, 1057, 247, 1805, 2628, 273, 2403, 253, 4602, 281, 15958, 11454, 2990, 35615, 50276, 7152, 33032, 6010, 436, 2929, 29328, 247, 10527, 4685, 273, 11454, 10336, 970, 253, 50276, 26985, 2113, 273, 2281, 5141, 340, 86, 1162, 267, 9169, 285, 253, 6012, 13757, 5018, 10748, 5644, 281, 5871, 824, 347, 2990, 8090, 285, 6489, 12541, 407, 37703, 5333, 13727, 352, 476, 671, 1421, 281, 27311, 267, 4254, 253, 2990, 476, 320, 8818, 342, 247, 3579, 18634, 269, 4930, 534, 6405, 265, 1175, 20741, 800, 3745, 50274, 2369, 652, 555, 253, 20149, 474, 12925, 273, 2990, 2216, 310, 581, 273, 253, 2234, 3884, 275, 6779, 4715, 50276, 783, 2929, 29328, 247, 4460, 8668, 2281, 5141, 275, 2990, 5140, 326, 14923, 281, 253, 2216, 273, 6928, 824, 347, 501, 3024, 285, 501, 8384, 2299, 627, 403, 247, 12190, 390, 643, 6928, 824, 347, 12006, 257, 292, 1327, 6790, 2990, 3966, 534, 403, 671, 9591, 973, 275, 3946, 285, 4158, 342, 43192, 1120, 982, 273, 4067, 3634, 285, 1805, 11786, 18634, 588, 253, 2488, 671, 2486, 841, 14237, 1561, 253, 8103, 273, 2281, 5141, 50275, 263, 1057, 253, 5978, 1232, 432, 2281, 5141, 8566, 20741, 800, 6779, 1705, 342, 3081, 12925, 273, 752, 2990, 390, 3662, 253, 1953, 275, 10199, 752, 310, 253, 1789, 310, 8654, 323, 2990, 2216, 534, 812, 921, 281, 320, 3576, 327, 2709, 8892, 50275, 74, 1158, 253, 2201, 2523, 273, 1655, 906, 310, 326, 359, 476, 923, 253, 8103, 5513, 50276, 3529, 253, 873, 598, 273, 6928, 501, 3024, 14993, 247, 8566, 6779, 533, 253, 2488, 556, 417, 2011, 2266, 4679, 326, 616, 20714, 18075, 407, 39793, 253, 8103, 2281, 5141, 556, 2762, 5886, 342, 253, 2990, 26647, 310, 2169, 2281, 5141, 5644, 2990, 342, 1805, 3045, 327, 2709, 8892, 50274, 6275, 352, 310, 4755, 326, 5333, 31429, 5644, 281, 27311, 534, 310, 3576, 323, 9162, 273, 374, 69, 5113, 2299, 275, 15958, 5303, 5629, 359, 778, 1646, 6425, 557, 290, 33195, 6779, 2581, 50276, 14644, 31429, 310, 352, 1896, 323, 253, 8103, 5644, 281, 27311, 1293, 6843, 24635, 5333, 31429, 50272, 17695, 285, 3806, 50276, 783, 4028, 310, 1175, 285, 3477, 281, 956, 10141, 1643, 13335, 534, 403, 3451, 891, 1158, 253, 2929, 812, 671, 2905, 281, 13757, 11797, 2990, 2216, 24088, 13757, 5933, 11797, 3676, 11454, 2990, 2605, 2216, 534, 26361, 1529, 8668, 50274, 249, 2087, 891, 1158, 253, 8103, 310, 4460, 533, 417, 14923, 2217, 281, 5513, 8783, 273, 1029, 3045, 6928, 2568, 2299, 625, 17947, 273, 10527, 1263, 943, 320, 14659, 50275, 7152, 339, 431, 248, 2929, 4081, 247, 2990, 6012, 432, 46875, 273, 2281, 5141, 1929, 347, 253, 278, 7083, 19, 8063, 835, 50276, 455, 3602, 403, 11120, 8818, 3828, 407, 3828, 275, 247, 3579, 18634, 8142, 2429, 342, 44528, 789, 407, 340, 86, 1162, 355, 9169, 253, 4081, 789, 310, 625, 751, 247, 3168, 3817, 326, 1016, 3828, 310, 625, 4665, 494, 50276, 16680, 2692, 253, 4081, 2990, 476, 3037, 247, 1175, 20741, 800, 3676, 6779, 1293, 667, 896, 18634, 3733, 253, 28529, 273, 253, 2990, 671, 5936, 326, 253, 2990, 310, 625, 5919, 281, 3037, 285, 3989, 275, 253, 9879, 5028, 50276, 1189, 455, 253, 4081, 789, 4453, 5272, 253, 2929, 310, 
973, 34218, 253, 28529, 285, 4679, 1646, 21414, 50276, 328, 9520, 253, 4081, 789, 310, 562, 273, 253, 30628, 15040, 285, 3103, 352, 310, 1892, 281, 2085, 9865, 5701, 7152, 339, 431, 248, 4477, 12661, 247, 3676, 2990, 2746, 970, 253, 8063, 273, 2281, 5141, 347, 247, 2957, 762, 247, 11786, 49104, 2746, 17816, 5899, 896, 44263, 318, 16280, 253, 789, 9437, 281, 4665, 253, 4081, 7792, 432, 1097, 38307, 285, 7605, 6849, 840, 5333, 25168, 3607, 403, 5469, 253, 16694, 1332, 4483, 253, 11250, 273, 247, 747, 3828, 2605, 4907, 2502, 328, 292, 534, 812, 5649, 253, 3676, 4715, 3114, 2167, 253, 4679, 403, 417, 11132, 8664, 253, 5421, 16634, 253, 4477, 4388, 281, 10304, 253, 4473, 1293, 247, 3426, 7092, 25184, 4583, 253, 2929, 310, 47386, 2217, 5001, 253, 15965, 12153, 7152, 339, 431, 248, 2929, 17075, 684, 271, 34560, 1232, 273, 44190, 9706, 273, 941, 715, 4735, 2317, 347, 247, 3676, 1566, 1925, 2502, 328, 292, 835, 1016, 3828, 10140, 281, 19502, 273, 253, 5556, 5837, 1232, 253, 4735, 2317, 2556, 281, 253, 278, 7083, 19, 8063, 50276, 783, 278, 7083, 19, 5556, 5837, 8115, 2792, 273, 1027, 5971, 715, 4858, 749, 31748, 342, 4644, 273, 1016, 24822, 1146, 36625, 1223, 253, 4644, 273, 253, 2862, 2317, 310, 11903, 1025, 50275, 262, 310, 19890, 281, 13383, 751, 1841, 2366, 285, 12401, 1841, 7419, 50276, 783, 38135, 273, 253, 2929, 310, 275, 326, 15895, 273, 253, 4735, 5556, 5837, 310, 30363, 249, 715, 247, 3676, 10336, 50275, 78, 7083, 19, 8063, 3133, 751, 247, 24600, 2746, 281, 4715, 3340, 1677, 326, 21496, 11333, 824, 347, 2454, 9706, 897, 352, 2168, 50276, 953, 5322, 281, 923, 690, 26565, 15965, 1971, 327, 436, 50276, 35529, 891, 755, 13477, 3965, 2393, 327, 407, 253, 41818, 50276, 338, 269, 633, 6168, 404, 14168, 1179, 33716, 285, 1182, 21448, 249, 14168, 1179, 30930, 7461, 1580, 340, 73, 91, 840, 269, 633, 248, 15559, 21448, 395, 594, 269, 89, 3005, 285, 269, 89, 403, 767, 1027, 3470, 50276, 28948, 1996, 275, 253, 2505, 1182, 71, 633, 22666, 50276, 395, 432, 840, 327, 1690, 5150, 1903, 269, 17176, 2585, 1825, 300, 91, 1059, 356, 91, 77, 18, 783, 22559, 18, 4609, 651, 1056, 352, 1646, 269, 633, 6168, 404, 14168, 68, 1596, 50276, 395, 752, 310, 305, 91, 77, 3124, 18, 50276, 29813, 854, 8599, 441, 752, 305, 91, 77, 3124, 18, 1364, 16851, 2858, 752, 310, 352, 4555, 50275, 261, 326, 247, 11454, 2990, 390, 690, 1566, 342, 3602, 253, 22559, 50276, 263, 310, 5150, 854, 247, 5426, 273, 305, 91, 77, 783, 85, 19337, 534, 1083, 752, 310, 253, 22559, 50276, 74, 13414, 1158, 253, 14168, 310, 7933, 3430, 6309, 14951, 310, 21643, 285, 14308, 6890, 1439, 5185, 50276, 74, 452, 671, 3533, 670, 5150, 1903, 835, 1180, 273, 8090, 310, 6425, 281, 25142, 1223, 46875, 278, 7083, 19, 285, 253, 4871, 273, 1016, 3828, 10140, 281, 278, 253, 3733, 2792, 275, 253, 10895, 50276, 601, 275, 1340, 281, 513, 247, 10603, 273, 271, 3280, 1269, 359, 878, 281, 1347, 298, 34560, 5018, 970, 253, 2862, 278, 2792, 1046, 673, 50276, 261, 2649, 326, 6425, 281, 2509, 247, 7863, 4715, 1232, 970, 253, 2862, 10895, 323, 1016, 10603, 50276, 5430, 43245, 19983, 310, 326, 50276, 74, 671, 13414, 3240, 2096, 849, 3714, 300, 91, 18, 783, 682, 18, 2987, 50276, 5430, 1057, 253, 22559, 1818, 689, 25142, 50276, 49363, 2593, 310, 417, 9073, 479, 342, 436, 1580, 697, 4767, 326, 299, 260, 75, 403, 10302, 323, 1016, 3828, 2858, 627, 403, 642, 4278, 327, 752, 253, 22559, 310, 285, 849, 305, 91, 18, 3124, 18, 310, 6760, 50276, 395, 604, 269, 633, 22666, 91, 77, 7461, 849, 513, 359, 755, 9162, 432, 326, 50276, 261, 352, 816, 1754, 327, 5426, 273, 7856, 81, 1944, 91, 77, 
432, 3239, 577, 50276, 71, 3341, 891, 717, 417, 2119, 604, 253, 906, 273, 13546, 247, 2410, 3024, 10336, 275, 2502, 328, 292, 672, 10234, 31429, 7658, 310, 2879, 253, 21496, 310, 512, 326, 10084, 50276, 261, 2649, 352, 8489, 4755, 326, 604, 1016, 3828, 273, 2502, 328, 292, 310, 13727, 275, 690, 1039, 840, 253, 2862, 2990, 310, 13727, 50276, 262, 9193, 751, 326, 752, 359, 403, 4715, 1060, 310, 417, 326, 275, 1340, 281, 452, 10234, 25168, 10603, 359, 1364, 452, 247, 345, 2254, 2858, 2581, 326, 359, 476, 4044, 247, 10234, 13727, 3676, 10336, 342, 10234, 13727, 8090, 2490, 187, 4118, 18435, 27, 2520, 2929, 2959, 45210, 7363, 534, 2789, 323, 247, 2834, 17401, 19235, 767, 273, 253, 10123, 497, 1512, 2159, 285, 3021, 497, 273, 3710, 897, 275, 9046, 247, 17401, 326, 3797, 253, 1029, 1026, 4263, 581, 534, 858, 417, 18212, 4326, 4513, 697, 4868, 50276, 9088, 310, 1199, 281, 26930, 275, 436, 19529, 30628, 14109, 253, 3236, 414, 273, 436, 2561, 20057, 2281, 5141, 13757, 281, 3676, 2990, 35615, 50276, 83, 18, 253, 2929, 29328, 247, 4460, 8668, 50276, 83, 21, 253, 38135, 273, 253, 2929, 310, 275, 326, 15895, 273, 253, 4735, 5556, 5837, 310, 30363, 249, 715, 247, 3676, 10336, 50276, 83, 22, 50276, 74, 1158, 253, 5140, 3133, 4722, 285, 253, 2281, 5141, 7982, 3133, 751, 247, 5272, 2181, 281, 22318, 891, 1119, 253, 2954, 273, 12425, 2281, 11903, 1320, 281, 2502, 328, 292, 281, 320, 3240, 19080, 50276, 83, 20, 2159, 253, 16694, 1332, 4483, 253, 11250, 273, 247, 747, 3828, 2605, 4907, 2502, 328, 292, 50276, 15337, 398, 671, 37977, 264, 253, 9380, 19843, 1690, 391, 21, 665, 5439, 616, 4868, 281, 721, 1754, 327, 14127, 19843, 38549, 432, 253, 4477, 50276, 83, 18, 253, 4028, 310, 1175, 285, 3477, 281, 956, 50276, 83, 21, 1501, 49794, 19843, 310, 417, 271, 3374, 10542, 50276, 38092, 22909, 2530, 407, 253, 4477, 285, 581, 625, 10182, 4361, 273, 253, 2929, 6518, 275, 4685, 273, 512, 253, 7794, 273, 253, 1566, 50276, 83, 19, 2159, 253, 2929, 310, 973, 34218, 50276, 35529, 627, 497, 690, 5161, 3533, 1475, 849, 973, 253, 2022, 8453, 3916, 273, 253, 2929, 403, 4516, 253, 954, 801, 554, 394, 5955, 327, 841, 12989, 310, 275, 253, 7000, 6293, 342, 391, 22, 275, 326, 6293, 627, 403, 1142, 2792, 5469, 533, 253, 767, 3374, 1646, 281, 320, 337, 1880, 253, 4602, 875, 2502, 328, 292, 285, 2629, 11454, 2036, 35615, 310, 10481, 4326, 4215, 594, 347, 281, 12647, 271, 8813, 323, 13576, 273, 1110, 2629, 35615, 751, 260, 79, 2224, 285, 374, 1880, 253, 21313, 273, 2502, 328, 1507, 1387, 31429, 8275, 14417, 310, 10084, 390, 36143, 747, 50276, 783, 806, 310, 1199, 625, 4275, 327, 253, 806, 2523, 391, 22, 12013, 275, 6010, 26401, 891, 1158, 253, 4477, 12661, 247, 9079, 326, 2502, 328, 1507, 5513, 45439, 3210, 2299, 253, 4477, 513, 417, 1379, 14282, 5018, 4404, 3588, 839, 436, 9079, 50276, 74, 651, 4499, 436, 342, 323, 1650, 253, 11715, 6928, 2929, 5987, 39962, 2061, 5375, 805, 2941, 1010, 1012, 534, 858, 271, 18714, 2628, 273, 16425, 323, 271, 490, 2012, 900, 8813, 273, 27311, 267, 6928, 50276, 74, 1089, 391, 22, 84, 8668, 327, 436, 1127, 281, 320, 18511, 275, 326, 253, 2929, 4390, 36908, 513, 2217, 281, 15249, 841, 2022, 3916, 2057, 949, 10263, 10799, 37825, 15965, 10291, 390, 949, 5661, 12820, 253, 6293, 556, 247, 1199, 625, 7000, 285, 8794, 3086, 5955, 50276, 783, 1273, 2523, 310, 417, 3240, 347, 4275, 281, 253, 8453, 273, 253, 2929, 533, 352, 369, 4879, 407, 2709, 30628, 50276, 83, 22, 891, 778, 320, 5816, 1633, 533, 1677, 253, 5140, 273, 2502, 328, 292, 891, 1928, 347, 2167, 253, 21313, 273, 247, 27311, 267, 2605, 2256, 281, 10234, 
31429, 310, 417, 30643, 10084, 50276, 83, 21, 4720, 891, 717, 417, 2119, 604, 253, 906, 273, 13546, 247, 2410, 3024, 10336, 275, 2502, 328, 292, 672, 10234, 31429, 7658, 310, 2879, 253, 21496, 310, 512, 326, 10084, 50276, 83, 21, 1501, 49794, 4361, 253, 6431, 875, 253, 4477, 285, 391, 22, 891, 717, 1335, 417, 4751, 13762, 326, 10234, 31429, 2867, 310, 512, 326, 10084, 533, 323, 479, 28763, 417, 247, 1921, 281, 12009, 50276, 255, 253, 1878, 253, 2929, 347, 3542, 556, 2649, 2568, 13762, 690, 10668, 4266, 2908, 327, 841, 3916, 50276, 284, 891, 5393, 387, 253, 1265, 436, 2929, 310, 45210, 533, 984, 891, 717, 8127, 15616, 342, 391, 22, 84, 24302, 891, 1158, 436, 2929, 1057, 417, 3240, 1509, 253, 2534, 323, 14924, 891, 5583, 247, 18235, 533, 891, 1007, 3579, 281, 6523, 247, 34615, 2715, 273, 436, 789, 275, 253, 2852, 891, 3524, 253, 8680, 1060, 556, 644, 4217, 281, 9745, 670, 326, 10046, 2715 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 5955, 342, 253, 4477, ... ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: summary the authors combine several cryptographic techniques to create a federated system that allows several entities to run classification against all the models held by the participants without revealing information in the process in particular the sample to be classified is not revealed to any other party and differential privacy is used to protect the training data that was used to train the models a central semihonest coordinator is used to aggregate the results and add the differential privacy without learning any private information pros the strength of this work lies in combining relevant techniques and in showing experimentally that the resulting system does improve over using a local model both when the training data is distributed evenly and when it is skewed while taking privacy considerations into account cons from a cryptographic point of view the combination of techniques is somewhat expected im wondering about the low statistical security 223 this seems to be related to the usage of unbounded integer secret sharing would it be possible to use secret sharing modulo an integer in which case the security could be perfect i think it would be easier to follow if steps 13 were combined in the description because they all take place between the same pairs of parties the exact techniques used dont seem to matter as long as the output secret sharing is the desired result namely the onehot vector i find the term collaborative learning somewhat overblown because the proposed protocol only runs classification collaboratively overall despite the points above im in favor of acceptance because the paper seems to improve on previous work and because it is written very well minor issues 33 leakage 41 odd juxtaposition in the formatting of arg max and sum figure 3 very hard to read in black and white docsep this work is motivated by healthcare and finance where separate parties may wish to collaborate and learn from each others data but are prevented from doing so due to privacy regulations this paper proposes confidential and private collaborative capc learning the first method provably achieving both confidentiality and privacy in a collaborative setting this work also discusses fairness i liked this part since it seems very cool however im not convinced that the method in this work is better than instahide i could be wrong minor comments in section 21 this paper should be discussed since it proposed a way to encrypt the images/texts instahide instancehiding schemes for private distributed learning https://arxiv.org/abs/2010.02772 icml 2020 yangsibo huang zhao song kai li sanjeev arora texthide tackling data privacy in language understanding tasks https://arxiv.org/abs/2010.06053 emnlp 2020 yangsibo huang zhao song danqi chen kai li sanjeev arora in section b it lists many theorems and definitions about differential privacy in section c it lists many backgrounds about sampling in section d it lists many definitions on fairness i dont quite see the point of having them in the appendix since none of them is mentioned in appendix e which is the proof of the main theory result in this paper this paper is closely related to differential privacy i think this paper should also be mentioned somewhere privacypreserving learning via deep net pruning https://arxiv.org/abs/2003.01876 yangsibo huang yushan su sachin ravi zhao song sanjeev arora kai li docsep this paper works on the problem of collaborative learning while preserving both
confidentiality and privacy of the data points it combines techniques from secure multiparty computation and differential privacy for the same and improves on confidential inference and pate in the process the new technique is called capc finally it states empirical results as evidence for the improved accuracy weakness 1 the evaluation is done on just two datasets so it is a little hard to judge whether the techniques would generalise or not 2 the writing of the paper itself is not that great because it is difficult to understand the low level details of the experiments 3 they talk very little about improving on the fairness guarantees strengths 1 their techniques enable collaborative learning even in settings where the local architectures of different parties are different 2 the algorithms they provide improve on fairness 3 their empirical results are better than the previously known methods evaluation i believe the combination of secure multiparty computation and differential privacy is not totally new but since it yields decent results i would say that the paper deserves a chance to be accepted ### Summary:
this work describes a system for collaborative learning in which several agents holding data want to improve their models by asking other agents to label their points the system preserves confidentiality of queries using mpc and also throws in differentially private aggregation of labels taken from the pate framework it provides experiments showing computational feasibility of the system the techniques use active learning to improve the models overall the ingredients are fairly standard but are put together in a new way to the best of my admittedly limited knowledge of this area this seems like a solid attempt to explore approaches for learning in a federated setting with strong limitations on data sharing
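the summary above singles out pate style differentially private label aggregation as the ingredient that protects the training data the snippet below is a minimal illustrative sketch of that aggregation step alone a noisy argmax over the parties votes it is not the capc protocol which additionally wraps the vote exchange in secure multiparty computation and the vote values the laplace scale and the class count are assumed here purely for illustration

```python
import numpy as np

def noisy_label_aggregation(party_votes, num_classes, laplace_scale=1.0, seed=None):
    # count the label votes of the participating parties for one query point,
    # perturb each count with laplace noise, and release only the noisy argmax
    rng = np.random.default_rng(seed)
    counts = np.bincount(party_votes, minlength=num_classes).astype(float)
    counts += rng.laplace(scale=laplace_scale, size=num_classes)
    return int(np.argmax(counts))

# hypothetical example: ten parties vote on a binary label for a single query
votes = np.array([1, 1, 0, 1, 1, 1, 0, 1, 1, 1])
print(noisy_label_aggregation(votes, num_classes=2, laplace_scale=2.0, seed=0))
```

the laplace noise added to the vote histogram is what gives the differential privacy guarantee a larger scale means stronger privacy but noisier released labels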
[ 30003, 310, 1677, 2278, ... ]
[ 1, 1, 1, ..., 1 ]
[ 30003, 310, 1677, 2278, ... ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper presents a method clime to generate constrained explanations using lime the method relies on boolean constraints to dictate the sampling region for lime this approach offers a number of advantages first users can compare important features for groups conditioned on different factors second the method adds a level of robustness against adversarial attacks the authors demonstrate their approach on a number of data sets and find useful insights when conditioning on different features and greater robustness to the attacks proposed by slack et al questions comments for the authors the authors consistently reference motivating examples left for the appendix throughout the paper these are phrased as important examples are discussed in detail in section 41 please consider using the space afforded by the revision to introduce these in the main text because this currently affects readability is there any empirical verification for the methods provided in 42 ie algorithms 2 3 i understand the result is phrased as a theoretical result but there is a lot of attention devoted to it in the paper and it would be good to evaluate it in practice it would be nice to demonstrate the output of this estimate for an example and provide some analysis there unless im missing something the extension of the attacks to only discrete data for lime is interesting making it much more similar to the shap attack in some sense because this attack only relies on discrete data im a bit confused about the claim that the methods help detect adversarial attacks if theyre using the adversarial classifier from slack et al shouldnt the method in 2b reveal that the underlying classifier is only relying on the race feature could you clarify how this shows that the method detects the attacks overall the constrained explanation approach seems pretty useful for conditioning explanations and could be a valuable contribution it also seems pretty straightforward to implement on top of lime not a bad thing so could be of immediate value what im most confused about right now is 1 evaluation for section 42 i see this as a useful result because the number of perturbations is a big issue for lime and this could be useful to assess the fidelity of an explanation with more confidence and to have more guidance on whether the fidelity is trustworthy id be interested to at least see fidelity plotted against delta for an example or something like this next im struggling to see how this method reveals the slack et al attacks as i said id expect to see a method which is not fooled by them and puts race as the most important feature im currently leaning weak accept because i do see value in the method but would appreciate a small empirical extension to the results in section 42 and clarification around how the method reveals adversarial attacks docsep the topic of this paper is highly topical as it focuses on providing explanations for blackbox classifiers the paper tackles the known issue with outofdistribution sampling in ribeiro et als lime the approach taken is to introduce the use of boolean constraints in the lime algorithm which allows for providing some guarantees on the quality of the explanations the paper is generally wellwritten and the level of detail provided makes it a pleasure to read however a bit more details could be provided at places related work is sufficiently covered in the main text and more discussion is provided in the appendix the
authors might take a look at a recent work by bjorklund et al that also tries to remedy the outofdistribution sampling problem using an approach based on robust regression sparse robust regression for explaining classifiers a bjorklund a henelius e oikarinen k kallonen k puolamki ds 2019 discovery science 351366 the experimental evaluation in the main paper is brief missing eg much of the detail needed to describe the experimental setting used since the approach taken is to use boolean constraints it would be relevant to know the effort needed to perform the sampling ie does the runtime of clime differ much from that of lime pros 1 very relevant topic explanations and demonstration of the importance and the relevance of the data distribution in explaining blackbox classifiers 2 analysis and proven guarantees of explanation quality 3 generally well written paper and clear scope with more detailed descriptions provided in the appendices cons 1 rather much of the relevant material is in the appendices 2 not enough details of the experimental evaluation provided 3 the end result the actual explanations and how a human would understand them is only very briefly touched upon questions if i understand correctly clime can still introduce samples that are not necessarily from the distribution of the original data which might be totally unknown and the original data would be the only information available as it seems to me that the difference to lime is that the samples need to obey the boolean constraints introduced is this correct would this then not mean that the correctness of the explanation heavily relies on using correct boolean constraints in the sampling procedure could there be a scenario where the boolean constraints used in the sampling process would lead to meaningless results ie the samples used would be outside the data space for the unknown distribution of the data the work covers explanations for binary classifiers and the focus is on tabular data do you see any eg performance issues that could affect the usability of clime in a more general scenario what is the performance of the sampling algorithm ie how does clime compare to lime when considering the time could you provide more details of the experimental evaluation ie details of the classifiers trained training testing dataset splits performance will the code be made openly available minor details some inconsistencies in the algorithms alg 1 is missing input only provided in the main text which makes the code hard to follow alg 1 and alg 3 both call getsamples but have a different number of arguments fig 1 the labels on the xaxis are way too small mlp is undefined on page 7 also page 7 that gives 691 missing accuracy page 7 the sentence eg if the height of the bar is 01 the corresponding feature is mostly top ranked in all explanations is quite unclear since the smallest values in fig 1 are roughly 1 check the capitalisation in the titles in references eg bayes dnf shap lime bayesian should not be small caps docsep the paper presents augmentations to the lime explanation system using logical constraints given a blackbox classifier f lime generates local explanations of an input x by sampling in the neighborhood of x and learning a linear classifier that matches f in that neighborhood the paper claims that this procedure is problematic because the samples in the neighborhood of x can be outofdistribution ood so learning from the ood samples is not useful the paper also claims that lime cannot work in constrained spaces although this is a similar point
to ood samples lastly the paper claims to certify the estimation quality more quickly by breaking early if the quality is below a certain threshold the problem of generating explanations is important and the use of logical constraints to focus the attention of the explanation generation is interesting however the contributions of the paper are unclear and the experimental results were not very exciting the paper seems to present two contributions 1 generating explanations given a constraint and 2 certifying quality of explanations but for 1 it seems they are simply calling offtheshelf tools that sample from a logical constraint it is unclear to me if there were any deeper insights than that for 2 their main claim of improvement comes from the assumption that it is okay to terminate early if the estimated fidelity is small this is not an unreasonable assumption but the contribution is again unclear the paper is simply targeting an easier estimation problem and existing approaches can probably do much better if they also make this assumption since i didnt notice any deeper technical contributions i was hoping to be convinced by experimental results after just reading the introduction my top concern was how realistic is it to generate logical constraints for an input x that you want an explanation for the intro tried to motivate it as doctors setting constraints on features of a patient so i was hoping for a realworld motivated scenario for constraining the explanation space in the experiments and some measure of how the clime explanations are better than those of lime unfortunately the constrained spaces for recidivism dont seem to be motivated at all and i dont see why i should believe that clime explanations are better than lime just based on fig1 if i were to reshape the paper i would spend much less space on the technical contributions as the insights are fairly straightforward and can be described very quickly i think the main concern that i think is still not addressed is how realistic and easy it is to expect a practitioner to write down logical constraints before querying explanations for a given input x i think there is a big leap of faith here that needs to be addressed experimentally via realworld case studies eg the doctor example in the introduction where logical constraints are very natural to come up with and that explanations generated using these logical constraints are much better docsep summary this paper proposes a new sampling method for lime based on userdefined boolean subspaces they show that using these subspaces rather than the default sampling settings of lime can lead to robustness against adversarial attacks and allow users to better uncover bugs and biases in subspaces relevant to the users task they additionally propose a new metric for measuring the quality of an explanation pros the authors create a novel link between the explainability literature and boolean satisfiability the proposed methodology is shown to be relevant in several different applications they propose simple metrics for evaluating the quality of samples and the quality of an explanation cons i dont have a lot of experience with iclr but it is not obvious to me that this paper is appropriate for this venue none of the metrics used in the paper seem to be weighted by density or distance to the point being explained as is done in lime given that the point of lime is that the function is unlikely to be globally approximable by a linear function the lack of incorporation of a weighting function seems to
make this framework inferior to that of lime and in fact theoretically the binary subspaces used in this paper are merely a specific instantiation of the flexible weighting function used in lime the paper as opposed to lime the software package i would expect that without such a weighting function in many cases explanations with similar values of the rho metric may vary widely in their usefulness as local explanations the paper suffers somewhat from relegating all of the examples to the supplementary material examples which are important to the points being made should be brought back into the main body of the paper certain relevant works seem to have been missed sokol et al https://arxiv.org/pdf/1910.13016.pdf proposed userspecified local surrogate explainers including allowing users to define their own sampling subspace but do not propose algorithms for sampling zhang et al https://arxiv.org/pdf/1904.12991.pdf show that lime does not accurately reflect local invariance to globally important variables in the first experiment without some ground truth knowledge as to what the classifier is doing it is not obviously useful to point out that clime identifies race as a top feature for females in the recidivism dataset the setup of zhang et al may be preferred where the ground truth behavior of the classifier is known in the adversarial attack experiment it is not completely clear what is done did you use clime to generate explanations of the adversarial classifier from slack et al was the adversarial classifier trained with access to clime perturbation functions or was it trained assuming lime perturbation functions this doesnt seem to immediately show superiority over the lime framework as at larger hamming distances the bias is still hidden i think a more appropriate comparison would allow the adversarial classifier access to the relevant perturbation function and consider accuracy at a variety of neighborhood widths for lime as with the hamming distance post rebuttal i have read the updated version of the paper and still feel that this paper may have errors regarding the flexibility and purpose of lime the idea is nice but the paper and evaluations would benefit from more polishing before publication i maintain my original score misrepresentation of lime section 3 lime does not assume that the sampling neighborhood is the same as the true distribution it may assume something weaker such as that the function being explained is fairly smooth in the sampling neighborhood note that this can be a feature of lime and not necessarily a bug if for example x1 and x2 are fully correlated in the data distribution but the classifier only uses x1 it would be impossible to tell this if sampling only within the data distribution by sampling outside the data distribution it becomes apparent that the classifier is using x1 only also lime assumes black box access to the function so i dont fully understand your statement that we generally do not have access to the true labels of instances generated through sampling it seems like you may be defining the correct explanation with respect to the true data distribution rather than to the classifier lime is meant to explain a blackbox classifier if the classifier is wrong lime should reveal what the classifier does that is the explanation should also be wrong with respect to the true data the statement about the framework capabilities is also simply not true users can define the data point to be explained as well as their own similarity kernel and/or kernel width evaluation its not entirely
obvious to me how we can be sure that clime is producing the right explanation in c1 c2 without knowing the function f if changing the training set changes the classifier f then it is correct that the explanation should change as mentioned above evaluating whether or not an explanation is correct should be done with respect to the classifier not the underlying data distribution in detecting adversarial attacks its not clear from the text whether or not you retrain the adversarial attack with your perturbation function further i suspect that lime may also be able to identify the sensitive feature for sufficiently small neighborhood sizes when sampling in binary space z it seems like a straw man argument to compare an optimized version of your sampling procedure to the default version of lime minor equations 1 and 2 if they are describing the usage in ribeiro et al should include a weighting function figure 2 seems not to be explained in the text and would benefit from more description ### Summary:
the authors present clime a variant of lime which samples from userdefined subspaces specified by boolean constraints one motivation is to address the ood sampling issue in regular lime they introduce a metric to quantify the severity of this issue and demonstrate empirically that clime helps to address it in order to stay close to the data distribution they use constraints based on hamming distance to data points they demonstrate that this approach helps to defend against the recent approach of slack et al 2020 to fool lime explanations the paper is close to borderline though concerns remain about experimental validation and the extent of novel contribution since the original lime framework is more flexible than described here and allows a custom distance function rev 1 believes that the original lime framework is sufficient to handle hamming distance constraints though sampling will be less efficient to their credit authors engaged in discussion but this should be further elaborated in a revised version
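to make the point of disagreement concrete the sketch below shows the kind of constrained neighbourhood sampling the reviews argue about perturbations restricted to a hamming ball around the instance before a linear surrogate is fit to the black box this is an illustrative approximation only not the authors clime implementation which samples from general boolean constraints with a solver and the radius sample count surrogate model and toy black box are all assumed

```python
import numpy as np
from sklearn.linear_model import Ridge

def hamming_ball_samples(x, radius, n_samples, seed=None):
    # perturb a binary instance x by flipping at most `radius` bits per sample,
    # so every perturbation stays inside the constrained neighbourhood
    rng = np.random.default_rng(seed)
    samples = np.tile(x, (n_samples, 1))
    for row in samples:
        k = rng.integers(0, radius + 1)           # number of bits to flip, 0..radius
        idx = rng.choice(len(x), size=k, replace=False)
        row[idx] = 1 - row[idx]
    return samples

def constrained_local_surrogate(black_box, x, radius=2, n_samples=500, seed=0):
    # fit a linear surrogate to the black box using only constrained samples
    x = np.asarray(x)
    z = hamming_ball_samples(x, radius, n_samples, seed)
    y = black_box(z)
    return Ridge(alpha=1.0).fit(z, y).coef_      # per-feature local importances

# hypothetical black box over five binary features
def black_box(z):
    return 0.8 * z[:, 0] - 0.5 * z[:, 3] + 0.1 * z[:, 1]

print(constrained_local_surrogate(black_box, x=[1, 0, 1, 1, 0]))
```

rev 1s counterpoint maps onto this sketch directly a vanilla lime setup could reach a similar effect by choosing a similarity kernel that puts zero weight outside the hamming ball at the cost of discarding the samples that fall outside it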
[ 7933, 432, 253, 3268, ... ]
[ 1, 1, 1, ..., 1 ]
[ 7933, 432, 253, 3268, 273, 253, 3236, 941, 534, 1537, 320, 9106, 7202, 285, 253, 3236, 941, 651, 320, 253, 760, 1491, 2130, 347, 352, 3133, 281, 479, 326, 253, 3064, 281, 30037, 310, 326, 253, 3530, 878, 281, 20090, 253, 12419, 10806, 5611, 310, 436, 3451, 50275, 12756, 436, 840, 417, 1599, 326, 253, 36594, 273, 253, 8813, 11306, 15771, 327, 970, 3451, 12419, 10806, 275, 253, 10491, 5199, 812, 627, 320, 247, 10076, 835, 253, 12419, 10806, 908, 275, 253, 10491, 1232, 651, 1421, 281, 34209, 1543, 26332, 253, 3530, 908, 651, 320, 3345, 253, 941, 2317, 323, 253, 7202, 3268, 273, 253, 941, 50276, 783, 789, 10949, 50276, 911, 11139, 569, 323, 8985, 49996, 285, 253, 2770, 310, 327, 10334, 792, 941, 513, 368, 923, 667, 24088, 3045, 3374, 326, 812, 2818, 253, 47813, 273, 502, 553, 275, 247, 625, 2087, 10076, 28910, 752, 310, 253, 3045, 273, 253, 10491, 5933, 26332, 849, 1057, 502, 553, 7277, 281, 30037, 672, 7296, 253, 673, 50276, 16534, 368, 2085, 625, 4278, 273, 253, 5661, 7103, 26332, 4278, 273, 253, 49996, 10166, 3733, 50276, 19462, 10895, 36509, 3045, 588, 253, 2127, 320, 1160, 22134, 2130, 50276, 37585, 4278, 50274, 8826, 45611, 275, 253, 11333, 20320, 337, 310, 5816, 3280, 760, 2530, 275, 253, 2022, 1071, 534, 2789, 253, 2127, 1892, 281, 956, 20320, 337, 285, 20320, 495, 1097, 1067, 4850, 10240, 533, 452, 1027, 1180, 273, 7125, 50276, 926, 337, 253, 13301, 275, 1269, 10565, 403, 1039, 1512, 1355, 50276, 1686, 81, 17011, 275, 3239, 818, 50276, 12563, 3239, 818, 326, 4245, 721, 4739, 5816, 7200, 50276, 6377, 818, 253, 6197, 24088, 604, 253, 4898, 273, 253, 2534, 310, 14805, 253, 3969, 4735, 310, 6571, 1755, 17045, 275, 512, 22909, 310, 3240, 12744, 1580, 253, 8004, 2193, 275, 3036, 337, 403, 11467, 337, 50276, 5903, 253, 5347, 5837, 275, 253, 14505, 275, 10414, 24088, 17699, 265, 277, 35478, 439, 522, 30037, 17699, 16561, 943, 417, 320, 1355, 12839, 7152, 339, 431, 248, 2929, 10262, 35919, 569, 281, 253, 30037, 8813, 985, 970, 13760, 10806, 1677, 247, 2806, 3364, 30410, 269, 30037, 15693, 1980, 22909, 273, 271, 3280, 1269, 407, 10491, 275, 253, 9168, 273, 1269, 285, 4715, 247, 4872, 30410, 326, 10129, 269, 275, 326, 9168, 253, 2929, 3916, 326, 436, 5199, 310, 20276, 984, 253, 3530, 275, 253, 9168, 273, 1269, 476, 320, 562, 1171, 35360, 258, 351, 594, 4715, 432, 253, 258, 351, 3530, 403, 417, 4217, 253, 2929, 671, 3916, 326, 30037, 2550, 789, 275, 20793, 8470, 3738, 436, 310, 247, 2074, 1127, 281, 258, 351, 3530, 1390, 314, 253, 2929, 3916, 281, 5306, 1419, 253, 13418, 3290, 625, 4541, 50276, 1615, 10155, 2393, 604, 253, 3290, 310, 2708, 247, 2176, 7887, 50276, 783, 1895, 273, 11365, 22909, 310, 1774, 285, 253, 897, 273, 13760, 10806, 281, 2770, 253, 4116, 273, 253, 8813, 5978, 310, 4722, 2299, 253, 9021, 273, 253, 2929, 403, 12744, 285, 253, 5661, 1543, 497, 417, 1077, 12302, 50276, 783, 2929, 3133, 281, 1246, 767, 9021, 337, 11365, 22909, 1677, 247, 7658, 285, 374, 5306, 5411, 3290, 273, 22909, 533, 323, 337, 352, 3133, 597, 403, 3365, 6789, 273, 649, 1041, 48164, 5657, 326, 3410, 432, 247, 13760, 7658, 352, 310, 12744, 281, 479, 604, 627, 497, 667, 12861, 16039, 685, 326, 323, 374, 616, 2022, 1750, 273, 7756, 3249, 432, 253, 9376, 326, 352, 310, 8261, 281, 24174, 2393, 604, 253, 5998, 32422, 310, 1355, 436, 310, 417, 271, 20697, 9376, 533, 253, 7680, 310, 969, 12744, 50276, 783, 2929, 310, 3365, 12262, 271, 6927, 13418, 1895, 285, 5368, 7274, 476, 3164, 513, 1199, 1805, 604, 597, 671, 1056, 436, 9376, 50276, 17480, 891, 42126, 4366, 667, 12861, 7681, 9021, 891, 369, 11525, 281, 320, 13762, 
407, 5661, 1543, 846, 816, 4361, 253, 10199, 619, 1755, 4468, 369, 50276, 5430, 15958, 310, 352, 281, 6635, 13760, 10806, 323, 271, 3280, 1269, 326, 368, 971, 271, 8813, 323, 253, 26432, 3597, 281, 41509, 352, 347, 11576, 4758, 10806, 327, 3386, 273, 247, 3110, 594, 891, 369, 11525, 323, 247, 1524, 10186, 17194, 10076, 323, 1030, 26208, 253, 8813, 2317, 275, 253, 4679, 285, 690, 2557, 273, 849, 253, 502, 553, 22909, 403, 1805, 685, 1110, 273, 30037, 19235, 253, 20793, 8470, 323, 761, 1741, 1204, 36908, 1646, 281, 320, 17194, 387, 512, 285, 891, 13414, 923, 2139, 891, 943, 2868, 326, 502, 553, 22909, 403, 1805, 685, 30037, 816, 1754, 327, 3036, 18, 50276, 338, 891, 497, 281, 40206, 2259, 253, 2929, 891, 651, 6947, 1199, 1679, 2317, 327, 253, 7681, 9021, 347, 253, 16039, 403, 9648, 15246, 285, 476, 320, 2529, 1077, 4541, 891, 1158, 253, 2022, 4468, 326, 891, 1335, 1158, 310, 417, 9713, 310, 849, 15958, 50276, 36423, 352, 310, 281, 1902, 247, 34815, 281, 3630, 1066, 13760, 10806, 1078, 7316, 272, 22909, 323, 247, 1677, 3280, 1269, 891, 1158, 627, 310, 247, 1943, 26416, 273, 6009, 1060, 326, 3198, 281, 320, 9713, 21657, 3066, 1524, 10186, 1083, 2175, 24088, 7345, 1650, 275, 253, 10199, 835, 13760, 10806, 403, 1077, 3626, 281, 1705, 598, 342, 285, 326, 22909, 4561, 970, 841, 13760, 10806, 403, 1199, 1805, 7152, 339, 793, 360, 3454, 436, 2929, 29328, 247, 747, 10491, 1332, 323, 30037, 1754, 327, 2608, 7769, 12419, 749, 31748, 597, 921, 326, 970, 841, 749, 31748, 2581, 685, 253, 4284, 10491, 7533, 273, 30037, 476, 1421, 281, 31640, 1411, 48960, 8104, 285, 1581, 4212, 281, 1805, 42417, 19775, 285, 31306, 275, 749, 31748, 4623, 281, 253, 4212, 4836, 597, 23000, 12661, 247, 747, 7982, 323, 10499, 253, 3290, 273, 271, 8813, 50276, 856, 84, 253, 4477, 2794, 247, 4460, 3048, 875, 253, 5513, 1430, 6239, 285, 12419, 3449, 74, 1430, 253, 4081, 16182, 310, 2011, 281, 320, 4623, 275, 2067, 1027, 4893, 597, 12661, 2969, 17082, 323, 16344, 253, 3290, 273, 3530, 285, 253, 3290, 273, 271, 8813, 50276, 5040, 891, 13414, 452, 247, 2257, 273, 2793, 342, 17857, 32888, 533, 352, 310, 417, 4755, 281, 479, 326, 436, 2929, 310, 4569, 323, 436, 18767, 50276, 15422, 273, 253, 17082, 908, 275, 253, 2929, 1646, 281, 320, 17375, 407, 4038, 19893, 281, 253, 1127, 1146, 5544, 347, 310, 2218, 275, 30037, 1677, 326, 253, 1127, 273, 30037, 310, 326, 253, 1159, 310, 11543, 281, 320, 21349, 4020, 494, 407, 247, 4872, 1159, 253, 3480, 273, 24319, 273, 247, 42428, 1159, 3133, 281, 1056, 436, 7792, 18134, 281, 326, 273, 30037, 285, 275, 958, 28055, 253, 8985, 749, 31748, 908, 275, 436, 2929, 403, 7960, 247, 2173, 8164, 2492, 273, 253, 12112, 42428, 1159, 908, 275, 30037, 253, 2929, 347, 10066, 281, 30037, 253, 3694, 5522, 891, 651, 1902, 326, 1293, 824, 247, 42428, 1159, 275, 1142, 2219, 22909, 342, 2074, 2193, 273, 253, 391, 1689, 7982, 778, 6889, 7561, 275, 616, 31471, 347, 1980, 22909, 50276, 783, 2929, 27171, 8489, 432, 1693, 72, 839, 512, 273, 253, 6667, 281, 253, 24864, 2144, 6667, 534, 403, 1774, 281, 253, 2792, 1146, 1160, 943, 320, 3982, 896, 715, 253, 2022, 2133, 273, 253, 2929, 50276, 33455, 4623, 2987, 3133, 281, 452, 644, 9829, 256, 536, 311, 1162, 355, 5987, 39962, 2061, 9275, 746, 6903, 20, 11718, 9275, 4081, 4212, 1553, 1245, 1980, 35701, 5513, 398, 1690, 6941, 4212, 281, 4853, 616, 1211, 10491, 24822, 533, 513, 417, 12661, 11333, 323, 10491, 1182, 12109, 1162, 355, 5987, 39962, 2061, 9275, 746, 2125, 805, 39405, 9275, 921, 326, 30037, 1057, 417, 13613, 4887, 1980, 31429, 281, 21349, 1774, 4903, 50275, 249, 253, 806, 
3368, 1293, 690, 3216, 5083, 3640, 347, 281, 752, 253, 30410, 310, 2509, 352, 310, 417, 9090, 4217, 281, 1127, 562, 326, 253, 502, 553, 22649, 5492, 347, 247, 1755, 4735, 323, 10753, 275, 253, 761, 301, 42339, 10895, 253, 9978, 273, 1182, 12109, 1162, 355, 778, 320, 9013, 835, 253, 3216, 5083, 3879, 273, 253, 30410, 310, 1929, 50276, 249, 253, 48960, 2983, 3368, 352, 310, 417, 4336, 2590, 752, 310, 2218, 858, 368, 897, 502, 553, 281, 6635, 22909, 273, 253, 48960, 30410, 432, 37358, 1162, 355, 369, 253, 48960, 30410, 10166, 342, 2289, 281, 502, 553, 20452, 3470, 390, 369, 352, 10166, 7384, 30037, 20452, 3470, 436, 36908, 1646, 281, 4745, 921, 34385, 689, 253, 30037, 7792, 347, 387, 4067, 288, 28444, 13849, 253, 8492, 310, 1335, 8763, 891, 1158, 247, 625, 4569, 5301, 651, 1581, 253, 48960, 30410, 2289, 281, 253, 4623, 20452, 1159, 285, 1908, 7200, 387, 247, 5235, 273, 9168, 34414, 323, 30037, 347, 342, 253, 288, 28444, 4181, 50276, 5996, 30080, 22559, 891, 452, 1239, 253, 9300, 2715, 273, 253, 2929, 285, 1335, 1928, 326, 436, 2929, 778, 452, 6332, 5001, 253, 15840, 285, 4096, 273, 30037, 253, 2934, 310, 5322, 533, 253, 2929, 285, 27163, 651, 5649, 432, 625, 35952, 1078, 9311, 891, 6558, 619, 3236, 4868, 50276, 24418, 37626, 273, 30037, 50276, 4674, 495, 30037, 1057, 417, 5467, 326, 253, 10491, 9168, 310, 253, 1072, 347, 253, 2032, 3268, 352, 778, 5467, 1633, 21076, 824, 347, 326, 253, 1159, 1146, 5544, 310, 9648, 6032, 275, 253, 10491, 9168, 3877, 326, 436, 476, 320, 247, 4735, 273, 30037, 285, 417, 7933, 247, 7505, 604, 323, 1650, 1269, 18, 285, 1269, 19, 403, 4751, 9578, 275, 253, 941, 3268, 533, 253, 30410, 760, 4648, 1269, 18, 352, 651, 320, 7479, 281, 2028, 436, 604, 10491, 760, 1561, 253, 941, 3268, 407, 10491, 3345, 253, 941, 3268, 352, 4916, 5165, 326, 253, 30410, 310, 970, 1269, 18, 760, 671, 30037, 19584, 2806, 3817, 2289, 281, 253, 1159, 594, 891, 13414, 4751, 2096, 634, 3908, 326, 359, 3839, 513, 417, 452, 2289, 281, 253, 2032, 13301, 273, 10872, 4561, 949, 10491, 352, 3133, 751, 368, 778, 320, 13947, 253, 3451, 8813, 342, 1675, 281, 253, 2032, 941, 3268, 2581, 685, 281, 253, 30410, 30037, 310, 5486, 281, 5513, 247, 2806, 3364, 30410, 604, 253, 30410, 310, 3430, 30037, 943, 10313, 752, 253, 30410, 1057, 326, 310, 253, 8813, 943, 671, 320, 3430, 342, 1675, 281, 253, 2032, 941, 253, 7792, 13789, 310, 671, 3365, 417, 2032, 4212, 476, 4853, 253, 941, 1127, 281, 320, 5544, 347, 973, 347, 616, 1211, 14259, 10295, 285, 263, 10295, 4871, 50276, 15419, 2368, 50276, 953, 417, 7094, 4755, 281, 479, 849, 359, 476, 320, 2119, 326, 502, 553, 310, 9603, 253, 987, 8813, 275, 260, 18, 260, 19, 1293, 8958, 253, 1159, 269, 604, 6890, 253, 3733, 873, 2544, 253, 30410, 269, 840, 352, 310, 3451, 326, 253, 8813, 943, 1818, 347, 5393, 1840, 16344, 1880, 390, 417, 271, 8813, 310, 3451, 943, 320, 2218, 342, 1675, 281, 253, 30410, 417, 253, 6944, 941, 3268, 275, 15549, 48960, 8104, 697, 417, 2590, 432, 253, 2505, 1880, 390, 417, 368, 851, 1949, 253, 48960, 2983, 342, 634, 20452, 1159, 2007, 891, 9101, 326, 30037, 778, 671, 320, 2104, 281, 4271, 253, 7996, 4735, 323, 10481, 1355, 9168, 9552, 672, 10491, 275, 8985, 2317, 1182, 352, 3133, 751, 247, 17844, 637, 4154, 281, 7277, 271, 18325, 2715, 273, 634, 10491, 5199, 281, 253, 4284, 2715, 273, 30037, 50276, 37585, 7424, 337, 285, 374, 604, 597, 403, 12930, 253, 10393, 275, 9412, 70, 9401, 1162, 355, 943, 2486, 247, 42428, 1159, 50276, 13206, 374, 3133, 417, 281, 320, 5544, 275, 253, 2505, 285, 651, 5649, 432, 625, 5740, 2490, 187, 4118, 18435, 27, 783, 4477, 
1246, 502, 553, 247, 12955, 273, 30037, 534, 3530, 432, 2608, 7769, 749, 31748, 7616, 407, 12419, 10806, 581, 16038, 310, 281, 2953, 253, 258, 351, 10491, 2523, 275, 3963, 30037, 597, 9569, 247, 7982, 281, 22048, 253, 12147, 273, 436, 2523, 285, 7568, 45190, 326, 502, 553, 7729, 281, 2953, 352, 275, 1340, 281, 3297, 2810, 281, 253, 941, 3268, 597, 897, 10806, 1754, 327, 288, 28444, 4181, 281, 941, 2792, 597, 7568, 326, 436, 2746, 7729, 281, 2342, 1411, 253, 3332, 2746, 273, 37358, 1162, 355, 9169, 281, 11213, 30037, 22909, 50276, 783, 2929, 310, 2810, 281, 45210, 2167, 7350, 3464, 670, 5661, 12820, 285, 253, 6070, 273, 4460, 7680, 1580, 253, 3236, 30037, 7792, 310, 625, 12112, 685, 2529, 1060, 285, 4483, 247, 2840, 4181, 1159, 3585, 337, 11532, 326, 253, 3236, 30037, 7792, 310, 4209, 281, 6016, 288, 28444, 4181, 10806, 2167, 10491, 588, 320, 1679, 5919, 281, 616, 6152, 4477, 9583, 275, 5955, 533, 436, 943, 320, 2007, 50221, 275, 247, 17265, 2715 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: after feedback i would like to thank the authors for careful revision of the paper and answering and addressing most of my concerns from the initial submission my main concern was clarity and now the paper looks much more clearer i believe this is a strong paper and it represents an interesting contribution for the community still things to fix a a dataset used in 42 is not stated b missing articles for example p5 in practice however we need a weaker regularization for a small dataset or a large model c upper case at the beginning of a sentence after question p8 is our advbnn model susceptible to transfer attack we answer we we the paper proposes a bayesian neural network with adversarial training as an approach for defence against adversarial attacks main pro it is an interesting and reasonable idea for defence against adversarial attacks to combine adversarial training and randomness in a nn bringing randomness into a new level in the form of a bnn which is shown to outperform both adversarial training and random nn alone main con clarity the paper does not crucially lack clarity but some claims general organisation of the paper and style of quite a few sentences can be largely improved in general the paper is sound the main idea appears to be novel and the paper addresses the very important and relevant problem in deep learning such as defence against adversarial attacks writing and general presentation can be improved especially regarding bayesian neural networks where some clarity issues almost become quality issues style of some sentences can be tuned to more formal in details 1 the organisation of section 11 can be improved a general concept attack and specific example pgd attack are on the same level of representation while it seems more logical that pgd attack should be a subsection of attack and while there is a paragraph attack there is no paragraph defence but rather only specific examples 2 the claim we can either sample w pwx y efficiently without knowing the closedform formula through the method known as stochastic gradient langevin dynamics sgld welling teh 2011 sounds like sgld is the only sampling method for bnn which is not true see eg hamiltonian monte carlo neals phd thesis 1994 it is better to be formulated as through for example the method 3 issues regarding eq 7 a why there is an expectation over x y there should be the joint probability of all x y in the evidence b could the authors add more details about why it is the elbo given that it is unconventional with adversarial examples added c it seems that it should be log py xadv omega rather than pxadv y omega d if the authors assume noise component ie y fx omega epsilon then they do not need to have a compulsory softmax layer in their network which is important for example for regression models then the claim our advbnn method trains an arbitrary bayesian neural network would be more justified 4 it would make the paper more selfcontained if the bayes by backprop algorithm would be described in more details space can be taken from the bnn introduction and it seems to be a typo that it is bayes by backprop rather than bayes by prop 5 there are missing citations in the text a no models from nips 2017 adversarial attack and defence competition kurakin et al 2018 are mentioned b citation to justify the claim cw attack and pgd attack mentioned below have been recognized as two stateoftheart whitebox attacks 
for image classification task c we can approximate the true posterior pwx y by a parametric distribution qw where the unknown parameter is estimated by minimizing klqw pwx y over there are a variety of works in approximate inference in bnn it would be better to cite some of them here d citation to justify the claim although in these cases the kldivergence of prior and posterior is hard to compute and practically we replace it with the montecarlo estimator which has higher variance resulting in slower convergence rate 6 the goal and result interpretation of the correlation experiment is not very clear 7 from the presentation of figure 4 it is unclear that this is a distribution of standard deviations of approximated posterior 8 to sum up our advbnn method trains an arbitrary bayesian neural network with the adversarial examples of the same model unclear which same model is meant 9 among them there are two lines of work showing effective results on mediumsized convolutional networks eg cifar10 from this sentence it looks like cifar10 is a network rather than a dataset 10 in notations y introduction is missing 11 it is better to use other symbol for perturbation rather than boldsymboldelta since delta is already used for the dirac delta function 12 via tuning the coefficient c in the composite loss function the coefficient c is never introduced minor 1 there are a few missing articles for example in notations in this paper we focus on the attack under the norm constraint 2 kurakin et al 2017 is described in the past tense whereas carlini wagner 2017a is described in the present tense 3 inner brackets in eq 2 are bigger than outer brackets 4 in eq 11 delta is not bold 5 in eq 12 it seems that the second and third terms should have rather than 6 footnote in page 6 seems to be incorrectly labelled as 1 instead of 2 docsepi have read the feedback and discussed with the authors on my concerns for a few rounds the revision makes much more sense now especially by removing section 33 and replacing it with more related experiments i have a doubt on whether the proposed method is principled see below discussions the authors responded honestly and came up with some other solution a principled approach of adversarially training bnns is still unknown but im glad that the authors are happy to think about this problem i have raised the score to 6 i wouldnt mind seeing this paper accepted and i believe this method as a practical solution will work well for vibased bnns but again this score 6 reflects my opinion that the approach is not principled thank you for an interesting read the paper proposes training a bayesian neural network bnn with adversarial training to the best of my knowledge the idea is new although from my perspective is quite straightforward but see some discussions below the paper is well written and easy to understand experimental results are promising but i dont understand how the last experiment relates to the main idea see comments below there are a few issues to be addressed in revision 1 the paper seems to have ignored many papers in bnn literature on defending adversarial attacks see eg 1234 and papers citing them in fact robustness to adversarial attacks is becoming a standard test case for developing approximate inference on bayesian neural networks this means figure 2 is misleading as in the paper bnn actually refers to bnn with meanfield variational gaussian approximations 2 carlini and wagner 2017a has discussed a cwbased attack that can increase the success rate of attack on 
dropout bnns which can be easily transferred to a corresponding pgd version essentially the pgd attack tested in the paper does not assume the knowledge of bnn let alone the adversarial training this seems to contradict to the pledge in athalye et al that the defence method should be tested against an attack that is aware of the defence 3 i am not exactly sure if equation 7 is the most appropriate way to do adversarial training for bnns from a modelling perspective if we can do bayesian inference exactly then after marginalisation of w the model does not assume independence between datapoints this means if we want to attack the model then we need to do mindeltax gamma log pdadv dadv x deltax y x y sim sim dtr log pdadv log int prodx y sim dtr pyx deltax w pw dw now the model evidence log pdadv is intractable and you resort to variational lowerbound but from the above equation we can see the lower bound writes as mindeltax gamma maxq eq sumx y sim dtr log pyx deltax w klqp which is different from your equation 7 in fact equation 7 is a lowerbound of the above which means the adversaries are somehow weakened 4 i am not exactly sure the purpose of section 33 true that variational inference has been used for compressing neural networks and the experiment in section 33 also support this however how does network pruning relate to adversarial robustness i didnt see any discussion on this point therefore section 33 seems to be irrelevant to the paper some papers on bnns adversarial robustness 1 li and gal dropout inference in bayesian neural networks with alphadivergences icml 2017 2 feinman et al detecting adversarial samples from artifacts arxiv170300410 3 louizos and welling multiplicative normalizing flows for variational bayesian neural networks icml 2017 4 smith and gal understanding measures of uncertainty for adversarial example detection uai 2018docsepthe paper extends the pgd adversarial training method madry et al 2017 to bayesian neural nets bnns the proposed method defines a generative process that ties the prediction output and the adversarial input pattern via a set of shared neural net weights these weights are then assinged a prior and the resultant posterior is approximated by variational inference strength the proposed approach is incremental but anyway novel the results are groundbreaking there are some technical flaws in the way the method has been presented but the rest of the paper is very wellwritten major weaknesses equation 7 does not seem to be precise first the notation pxadv y w is severely misleading if xadv is also an input no matter if stochastic or deterministic the likelihood should read py w xadv furthermore if the resultant method is a bnn with an additional expectation on xadv the distribution employed on xadv resulting from the attack generation process should also be written in the form of the related probability distribution eg nxadvxsigma second the constraint that xadv should lie within the gammaball of x has some implications on the validity of the jensens inequality which relates equation 7 to proper posterior inference blundell et als algorithm should be renamed to bayesbybackprop this is also an outdated inference technique for quite many scenarios including the one presented in this paper why did not the authors benefit from the local reparametrization trick that enjoy much lower estimator variance there even emerge samplingfree techniques that nullify this variance altogether and provide much more stable training experience and some minor issues the 
introduction part of paper is unnecessarily long and the method part is in turn too thin as a reader i would prefer getting deeper into the proposed method instead of reading side material which i can also find in the cited articles i do symphathize and agree that python is a dominant language in the ml community yet it is better scientific writing practice to provide languageindependent algorithmic findings as pseudocode instead of native python overall this is a solid work with a novel method and very strong experimental findings having my grade discounted due to the technical issues i listed above and the limitedness of the algorithmic novelty i still view it as an accept case ### Summary:
reviewers are in a consensus and recommended to accept after engaging with the authors please take reviewers comments into consideration to improve your submission for the camera ready
[ …input_ids token-ID array omitted… ]
[ …attention-mask array of 1s omitted… ]
[ …token-ID array (labels) omitted… ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper proposes elimination rules that can be easily combined with existing fixedconfidence algorithms for pure exploration both for sampling and stopping strategies the benefits are improved performance in the nonasymptotic regime the authors show a result that the elimination stopping results in eliminating pieces no later than the standard llr stopping strength is that the proposed strategy leads to improved empirical performance the weakness is that the theory does not seem to be strong at least in the current form the contribution is clear but the details can be explained better so i would say that clarity is below expectation originality quality and significance is all mediocre just around the bar my main complaint is that there is no provable improvement of the proposed methods in mathematical analysis i do not mean the proof for not worse but a proof like a strictly upper bound can be achieved and it can make an orderwise difference in some cases empirical evaluation shows that the empirical performance improvement can be better but in my opinion it is rather a minor improvement after rebuttal i have changed the score as the authors have addressed the concern na docsepthis work bridges the gap between elimination algorithms and fully adaptive algorithms for identification problems in bandits one the one hand the main strength and weakness of elimination algorithms are their high computational efficiency and large sample complexities respectively one the other hand fully adaptive algorithms enjoy stronger sample complexity guarantees while being computationally inefficient in this paper are designed fully adaptive algorithms that perform elimination these algorithms are shown to get the best of both worlds both theoretically and empirically strengths the framework has a broad range of applications it covers among others the problems of gaussian bandits topk thresholding identification for linear and unstructured bandits on this general setup this work successfully tackles the problem of designing strategies that are both fully adaptive and performing elimination the experiments support the theoretical claims weaknesses this paper could benefit from a better presentation it would be nice to highlight a pseudocode of the algorithm designed it takes a long time to get to the main theoretical results the theorems are hard to parse without examplesinstantiations i did not really find a clear statement about the limitations of the work the authors state that the potential negative societal impacts of their work is na this work is mainly theoretical but it might still be valuable to mention how bandit algorithms are applied to the world and eg what could go wrong if the suggested algorithms are used docsep the authors propose a new algorithm design principle for elimination algorithms in the fixedconfidence bestarmidentification setting with parametric distributions that modifies the usual stopping rule of trackandstopstyle algorithms adaptive in their terminology roughly under an assumption that the parameter space of each alternative for each arm i ie the parameters that would result in any arm other than i being the best can be decomposed into a small subset of pieces that are easy to optimize over the authors propose elimination of individual pieces in contrast the typical trackandstop algorithm will continue until all pieces except for those belonging to a single action 
can be eliminated this procedure can reduce the cost of checking the stopping rule and is shown to have the same guarantees correctness and asymptotic efficiency the authors then present a sampling rule that also takes into account pieces that have been eliminated and show that under a technical assumption the new sampling rule does not degrade the sample complexity improved performance in both sample complexity and time complexity is shown experimentally while the writing was generally high quality i found the overall presentation confusing and the theoretical results a bit underwhelming though i didnt read the appendix so it is possible the technical difficulties were larger than i thought first i found the main ideas difficult to understand and had to work through section 2 a few times i think the presentation could be improved if the section was restructured to highlight a cartoon of the original trackandstop and introduce your new ideas and modifications it wasnt immediately clear to me if your procedure would produce an action elimination algorithm or an adaptive algorithm beginning with the explanation that you are in the parametric setting and are modifying track and stop would help i also wish that the elimination sampling algorithm was presented more clearly perhaps with pseudocode at least for your favorite instantiation of an adaptive algorithm it might be good to give a bit more intuition about what adaptive sampling and elimination algorithms do and their differences especially since there are several well known algorithms eg lucb ts variant that dont fit into your two classes specifically i found your repeatedly test the correctness of every answer confusing and it would probably be clearer to describe how elimination algorithms work in phases or epochs i might go so far as to argue that adaptive is a confusing choice as elimination algorithms also change their sampling distributions as data are collected albeit usually through eliminating actions perhaps phased vs nonphased would be a clearer distinction another very important distinction is that elimination algorithms are generally nonparametric needing only some way to get finite sample confidence intervals on the target parameters whereas adaptive methods require strong parametric assumptions i wish there was theory showing improvement in the sample complexity and i wish there was a discussion about the assumptions specifically an argument about why they should be necessary other feedback line 137 bad grammar i think a closer discussion about the assumptions is needed societal impact was not addressed but i dont think its necessary docsepthe paper considers the problem of finding or identifying the correct answer to a specific question in a bandit scenario examples are what is the arm with the highest mean best arm identification or which arms have a mean larger than some specific value thresholding bandits for this purpose they revisit the algorithmic approaches based on loglikelihood ratio stopping which are theoretically quite appealing but are oftentimes computationally expensive roughly speaking these approaches stop as soon as there is an answer such that the alternatives to this answer are not plausible enough measured in terms of minimal distance of loglikelihood ratio under the assumption that the underlying bandit problem scenario allows a decomposition of the possible alternative sets into smaller subsets called pieces the authors suggest two stopping criteria that have provably a smaller stopping time than the
loglikelihood approach provided all are using the same sampling strategy these stopping criteria combine the loglikelihood ratio stopping approach with the nonadaptive eliminationbased approaches by excluding pieces as soon as these are redundant for the decision making the authors show that for a couple of gaussian linear bandit problems the assumption above is satisfied and the minimizer of the loglikelihood ratio can be efficiently computed on the corresponding pieces moreover a bound in expectation on the resulting stopping times is shown when the stopping criteria are combined with an efficient sampling rule inspired by the suggested stopping criteria the authors derive a sampling strategy that is specifically tailored towards an early stopping of the former the suggested approaches are investigated in an experimental study on synthetic data strengths quite interesting topic the underlying problem setting of general identification in bandits is a relevant theoretical problem scenario covering a range of practical applications and has already been investigated by several authors the underlying research question of whether the fully adaptive approaches can be adapted such that they are computationally more efficient such as the nonadaptive elimination strategies is of utmost relevance for practical applications the key challenge is of course to maintain the appealing theoretical properties of the fully adaptive approaches soundness overall most of the results are sound as well as their proofs i havent checked all proofs in every last detail but had a thorough look at the proofs of core results which were fine as far as i can tell quality of writing and presentation in general the paper is well written although it is a rather technical paper the notation is well thought out weaknesses modest theoretical contribution although the paper contains a fairly extensive appendix of proofs of the theoretical results these are largely based on prior work and no groundbreaking new ideas come in as far as i can tell therefore i think the theoretical contribution of this paper is okay but not significant vague description of eliminating at sampling the part about the suggested sampling strategy can be improved regarding its clarity to be more precise i would have appreciated if the eliminating at sampling approach had been presented in the form of pseudocode instead of a rather vague verbal description i would have appreciated if the authors would give a remark regarding a direct comparison of the actual stopping times ie in which case it is possible and in which it is not i think only for theorem 26 it is ### Summary:
this paper has initially received borderline scores the reviewers appreciated the general algorithmic framework and the high technical quality but some of them lamented the relative weakness of the contribution in particular the lack of hard improvements over existing results and pointed out that the presentation could use some improvements some of these concerns were addressed in a revised version of the paper and a series of wellwritten author responses which eventually convinced several reviewers to raise their scores eventually all reviewers agreed that the paper is acceptable for publication the authors are encouraged to do another pass of revision when preparing the final version of the paper and take all the reviewers comments into account in the process
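For reference, the llr/glr stopping rule that the reviews above repeatedly refer to is sketched below in generic notation; the symbols (K arms, pull counts N_i(t), empirical means, divergence d, threshold beta) are assumed for illustration and are not taken from the paper under review.

```latex
% Generic GLR/LLR stopping rule for fixed-confidence identification (illustrative
% notation, not the paper's): stop at the first time t at which some candidate
% answer a is "far enough" from every parameter in its alternative set Alt(a).
\tau_\delta \;=\; \inf\Big\{\, t \in \mathbb{N} :\;
  \max_{a}\; \inf_{\lambda \in \mathrm{Alt}(a)}\;
  \sum_{i=1}^{K} N_i(t)\, d\big(\hat\mu_i(t), \lambda_i\big) \;>\; \beta(t,\delta) \,\Big\}
```

The elimination idea the reviewers discuss amounts to writing Alt(a) as a union of a few easy-to-optimize pieces and discarding a piece permanently once its own infimum exceeds beta(t, delta), so that later stopping checks only range over the surviving pieces.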
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper describes a self supervision training approach to pretrain speech encoders this is related to recent work on wav2vec 20 and simclr the contrastive loss is similar to wav2vec 20 while the invariance to perturbation ie augmentation is akin to simclr performance is competitive with wav2vec 2 on librispeech the goal of perturbation or augmentation invariance is well motivated in speech recognition and also ml representation learning more broadly the librispeech 960 numbers are quite strong the results using synthetic noisy data are less convincing than naturally occurring noise this is especially important since the synthetic noise is used during training as well in figure 1 it appears that the positional padding is applied equally to both the input and output of the teacher this is not as clear from the description in section 31 if this is the case it would be helpful to expand on the description section 33 specaugment typically masks both time and frequency bands it is not clear if this is done here or if the masking is performed on either time or frequency assuming both time and frequency are masked is the area masked by both time and frequency replaced with zeros or gaussian noise section 35 what is the rationale for using ctc asr decoding rather than say an attention decoder or rnnt model section 42 it would be useful to have more insight into why spiral trains faster than wav2vec 20 is this due to implementation or is there something fundamental to these algorithms that makes spiral faster section 42 it might be useful to compare these results to speechstew section 531 it is interesting that other approaches are more robust to perturbing both teacher and student inputs while spiral requires clean teacher inputs more thorough discussion of this would be helpful though space is tight shorter notes section 2 the description of liang et al 2018 is notably brief compared to others the comparison of perturbation invariance in asr training seems like a worthwhile comparison to invariance during pretraining table 3 only row 2 includes 3 significant figures the rest includes 2 it would be better to be consistent here same comment in tables 4 and 5 table 5 it would be better to organize the columns by increasing or decreasing snr typo in section 532 where when the predictor is removed we observe the we shouldnt be capitalized this is a compelling paper its a well motivated and technically sound perspective on selfsupervised training the performance on librispeech 960 is quite strong the performance in noise conditions is strong but would be more convincing if shown on more actual rather than synthetic noise docsepthe paper proposes a new speech pretraining approach named spiral the proposed method is trained by learning representation denoising of perturbed data with the teacherstudent framework in a selfsupervised manner the motivation of the method is to learn a representation that is invariant to perturbation so that the learnt representation is a highlevel one eg carrying content information that can enhance the downstream speech applications compared to the stateoftheart speech pretraining method wav2vec 20 the proposed method can achieve competitive or better results but with a significant reduction of the training cost the proposed method is also able to deal with the noiserobustness problem the paper is well written with the following strengths 1 the proposed
method is selfsupervised learning with the teacherstudent framework that is an extension to mean teacher mt and bootstrap your own latent byol 2 the proposed method is a novel one that aims to learn the representation denoising of perturbed data with the teacherstudent framework in addition the proposed method can also be combined with multicondition training to improve the noiserobustness 3 the motivation of the method sounds quite reasonable which aims to learn the representation that is invariant to perturbation so that the learnt representation is a highlevel one eg carrying content information to enhance the downstream speech applications 4 an inutterance contrastive loss proposed by chopra et al 2005 is adopted to avoid the model collapse problem 5 a position randomization technique is further introduced to prevent the positional collapse problem 6 a gradual downsampling strategy has been adopted to train the spiral model to reduce the computation cost 7 plenty of experiments have been conducted to evaluate the performance of the proposed method 8 the proposed method can achieve competitive or better results than wav2vec 20 but with significantly less training cost based on the above main review especially the strengths of the paper although some of the ideas have been borrowed from previous work eg mt byol contrastive loss etc the paper has proposed extensions to these works by considering the sequential applications in speech processing and the relations with the previous work have been clearly discussed in section 2 hence i think the paper could be accepted for publication at iclr docsepthe paper presents a novel approach to selfsupervised speech representation learning which promises to be simpler than existing methods such as wav2vec 20 the approach is inspired by the byol approach from cv and is shown to be indeed largely as effective as wav2vec 20 while being significantly more efficient during training strengths the paper is overall well written and presents a good set of experimental results the baselines are strong as far as i can tell the approach is well motivated and explained well the authors promise to release code upon acceptance of the paper which should allow readers to verify results and build on top of it pretrained selfsupervised audio feature extractors have the potential to improve speech recognition in many domains eg low resource multilingual speech recognition and reducing compute is a critical step in that direction weaknesses some of the experimental results are a bit early i would like to see results with lm rescoring after the hyper parameters have been optimized so that results are more comparable and hopefully consistent similarly with mixed precision training would it be possible to get rid of the convolutional layers and build a model that is based entirely on selfattention this should be even more efficient on gpu update the additional tests on chime data and the updated decoding results address these concerns i am updating my assessment making selfsupervised speech representations easier to train is a significant contribution and this paper presents a viable approach to doing so some results feel a bit preliminary but the paper is well written and the authors may have updated results by the time of the conference and they will release code plus presumably models my recommendation is thus for accept docsepthe paper describes teacherstudent selfsupervised training with a denoising advantage by perturbing the student it elaborates on techniques to
avoid collapse and compares results with wav2vec 20 the paper describes teacherstudent selfsupervised training the teacher generates a representation on clean data the student matches the teachers representation with a perturbed version of the utterance inutterance contrastive loss is used to avoid learning a trivial representation positional collapse is avoided by random padding beforeafter the utterance fed to the teacher the teacher is updated as the moving average of student checkpoints results are presented for lowresource and highresource settings using librispeech and librilight the strengths are the paper has an interesting premise combining ideas from noisystudent training and selfsupervised training the results are somewhat comparable to wav2vec 20 but at lower training cost the method is robust to unseen noise situations as far as i understand it does not require separately pretraining the teacher unlike noisy student training this could be clarified however the weaknesses include the results are better or on par with wav2vec 20 in some cases but not on the whole the authors mention that the settings are not fully tuned it would be interesting to know what the best results are after tuning while there are some ablation studies some unanswered questions remain in particular i am curious how much is gained from additive noise perturbation and from positional randomization in looking at the noisy test results table 5 there is no additive noise in either pretraining or finetuning for wav2vec 20 hence this comparison seems incomplete other questions and notes related work could include contrastive semisupervised learning work from facebook as another way to combine these two aspects how is the teacher initialized in table 2 i assume training step refers to pretraining steps how is the optimal number of pretraining steps determined for each method what perturbation settings are included in the wav2vec setup table 3 noisy student unlabeled data ls860 is this a typo page 7 theclean testclean typo the paper has an interesting premise and some promising experiments to justify it on the whole the method is somewhat comparable to wav2vec 20 although it falls short in several cases this can be made clearer in the abstract and introduction while highlighting the reduced training cost i am not convinced that the comparison to wav2vec 20 is complete in terms of denoising uses as there is no discussion of specaug and additive noise in wav2vec 20 during pretraining or finetuning also some additional ablation studies to understand the benefit of additive noise and positional randomization would strengthen the paper docsepthe paper introduces spiral a new method for selfsupervised pretraining for speech spiral is based on the teacherstudent framework similar to mean teacher tarvainen and valpola 2017 and byol grill et al 2020 where the teachers weights are updated as a moving average of the students weights but makes additional modifications for sequence tasks like speech an inutterance contrastive loss is used as the pretraining objective position randomization of the teachers input is used to avoid representation collapse ablation experiments are done to show that the predictor which was essential in earlier works like byol and simsiam to avoid collapse can be replaced with a convolutional projection head without performance degradation empirically the main contributions of the paper are 1 achieving similarbetter wer on librispeech compared to wav2vec 20 with 35 of the training cost and 2 
incorporating multicondition training which has been used in supervised training in the past for noiserobust asr strengths 1 while the paper adapts the teacherstudent selfsupervised pretraining framework that has been studied for image representation learning grill et al chen he tian et al the modifications for sequential learning inutterance contrastive loss position randomization and convolutional subsampling are essential for speech 2 the authors perform ablations to show the relevance of the projection head visavis the predictor and demonstrate they are complementary this result raises important questions about conclusions drawn about this framework in previous works like tian et al 2021 and their applicability to sequence representation learning 3 experimental setup eg hyperparameters is described in detail spiral obtains strong wer performance on librispeech with a fraction of the training of wav2vec 20 multicondition training is shown to be effective at pretraining versus only at finetuning to improve noise robustness 10 relative wer reduction while maintaining performance on clean speech 4 the authors have mentioned that they will release the code at the time of publication this would be useful for the community to extend this direction of research 5 the paper is clear and easytofollow with sufficient discussion of related work weaknesses 1 while the proposed model does well on clean librispeech performance under noisy conditions may be concerning section 52 in table 5 when the synthetic noise at test time is matched with training noise both in noise type and snr the wers are good but mismatched snr significantly degrades wer 80 to 261 for testclean this suggests that the model may be overfitting to the range of snrs during training second there are no evaluations on mismatched noise types during train and test so it is hard to predict the models generalizability to other types of noise furthermore the absence of evaluations on real noisy data such as chime4 raises some questions about the benefits of multicondition training these questions are of particular interest since the model is named perturbation invariant 2 there have been several recent works analyzing representation collapse of noncontrastive ssl models eg tian et al 2021 which suggest that a predictor on the student branch and weight decay during training are essential to prevent collapse the authors mention briefly in section 2 that the predictor was not sufficient and they had to use the inutterance contrastive loss with position randomization can they suggest reasons why this might be the case 3 the ablation in section 532 indicates that performance degrades significantly when perturbations are applied to the teachers input this again deviates from standard practice in image representation learning where augmentation is applied on the inputs for both the teacher and the student and as astutely observed by the authors is closer in principle to standard selftraining could the authors suggest why input noise is harmful but computation noise dropout works by for instance elucidating the link to gal and ghahremani 2016 other commentsquestions 1 for multicondition pretraining is the noisy input used for both the student and the teacher if yes what would happen if the clean input is used for the teacher 2 for librispeech evaluation under the low resource setting perhaps it would be fairer although not comparable to baevski et al 2020 to exclude the trainclean100 subset during pretraining as done in park et al 2020 3 in the 
first paragraph of page 2 the authors comment that spiral also allows to combine with multicondition training this is slightly misleading as there is no inherent limitation in the other models wav2vec 20 hubert mentioned in the previous line that would prevent multicondition training with those models 4 have the authors tried learning from raw waveforms instead of logmel filterbanks some typos section 1 para 3 learning representation denoising of perturbed data learning denoising representation of perturbed data section 32 para 1 predicotr predictor section 35 para 3 due limited receptive field due to limited receptive field while there are some clear limitations in the empirical sense particularly in the claim that the model learns denoised highlevel representations the paper advances selfsupervised learning for speech tasks by adapting the popular teacherstudent framework popular in image representation learning the model achieves wers similarbetter than the popular wav2vec 20 architecture while reducing training time and model size the ablation experiments raise questions about whether the modelingtraining strategies deemed essential in that modality are also needed for sequential tasks although the authors have not addressed these questions directly in this work overall this work and the promised code release is likely to lead to new explorations in this direction for speech selfsupervised learning ### Summary:
this paper proposed a selfsupervised speech pretraining approach by the name of spiral to learning perturbationinvariant representations in a teacherstudent setting the authors introduced a variety of techniques to improve the performance and stabilize the training compared to the popular unsupervised learning model wav2vec 20 better wers were reported using spiral with a reduced training cost all reviewers considered the work solid with sufficient novelty but also raised concerns regarding the generalization under unseen realworld noisy conditions and missing decoding details the authors responded with new chime3 results and updated lm decoding results the new results show that after a bug fix spiral can outperform wav2vec 20 when no external lm is used overall the proposed approach is technically novel the experiments are extensive and the results are compelling in addition the training time can be significantly reduced compared to wav2vec 20 all reviewers are supportive so i would recommend accept
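Several of the spiral reviews describe the teacher as a moving average of student checkpoints, with the student fed a perturbed input and the teacher a clean one. The sketch below illustrates that EMA teacher-student pattern in PyTorch; it is a hypothetical toy, not the authors' code, and it uses a plain MSE loss as a stand-in for the in-utterance contrastive loss described in the reviews.

```python
# Hypothetical sketch of the EMA teacher-student pattern described in the reviews
# (mean-teacher / BYOL-style update); illustrative code, not the SPIRAL release.
import copy
import torch
import torch.nn.functional as F

def ema_update(teacher: torch.nn.Module, student: torch.nn.Module, decay: float = 0.999) -> None:
    """Update teacher weights in place as an exponential moving average of the student's."""
    with torch.no_grad():
        for p_t, p_s in zip(teacher.parameters(), student.parameters()):
            p_t.mul_(decay).add_(p_s, alpha=1.0 - decay)

# Stand-in encoder: the real model is a convolutional/transformer speech encoder.
student = torch.nn.Linear(80, 256)
teacher = copy.deepcopy(student)
for p in teacher.parameters():
    p.requires_grad_(False)  # the teacher receives no gradients, only EMA updates

optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)

frames = torch.randn(8, 80)                              # stand-in for log-mel frames of one utterance
noisy_frames = frames + 0.1 * torch.randn_like(frames)   # perturbed student input (noise/SpecAugment-like)

target = teacher(frames).detach()      # teacher sees the clean input
prediction = student(noisy_frames)     # student sees the perturbed input

# Placeholder objective: SPIRAL uses an in-utterance contrastive loss instead of MSE.
loss = F.mse_loss(prediction, target)
loss.backward()
optimizer.step()
optimizer.zero_grad()
ema_update(teacher, student)
```

In the method as the reviews describe it, position randomization (random padding of the teacher input) and a projection head would be added on top of this skeleton; they are omitted here to keep the illustration minimal.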
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 310, 4583, 973, 3542, 285, 10262, 247, 1175, 873, 273, 5661, 1543, 253, 1666, 25379, 403, 2266, 323, 752, 891, 476, 2028, 253, 2746, 310, 973, 17194, 285, 5544, 973, 253, 4477, 9023, 281, 3727, 2127, 2220, 14924, 273, 253, 2929, 534, 943, 1581, 10668, 281, 12654, 1543, 285, 1973, 327, 1755, 273, 352, 3215, 11273, 1881, 35421, 9797, 4735, 4908, 641, 452, 253, 2442, 281, 3157, 6519, 8981, 275, 1142, 10625, 24088, 1698, 7741, 1554, 39661, 6519, 8981, 285, 8493, 11897, 310, 247, 4619, 3213, 275, 326, 3884, 50276, 20881, 1255, 265, 50276, 8826, 273, 253, 5661, 1543, 403, 247, 2372, 2393, 891, 651, 751, 281, 923, 1543, 342, 298, 78, 9708, 4263, 846, 253, 4373, 3602, 452, 644, 18325, 594, 326, 1543, 403, 625, 10870, 285, 18670, 5185, 12014, 342, 6804, 12320, 3733, 651, 352, 320, 1896, 281, 755, 8314, 273, 253, 27311, 267, 8090, 285, 1973, 247, 1566, 326, 310, 1754, 7094, 327, 1881, 42959, 436, 943, 320, 1014, 625, 5919, 327, 305, 11113, 50276, 11183, 253, 3081, 5216, 327, 448, 553, 941, 285, 253, 9300, 28490, 1543, 2953, 841, 7350, 891, 717, 22753, 619, 6803, 2403, 1881, 35421, 6519, 14237, 6927, 281, 6194, 310, 247, 1534, 7680, 285, 436, 2929, 10262, 247, 16571, 2746, 281, 2509, 594, 690, 1543, 1928, 247, 2372, 12611, 533, 253, 2929, 310, 973, 3542, 285, 253, 4477, 778, 452, 9300, 1543, 533, 253, 673, 273, 253, 8059, 285, 597, 588, 3727, 2127, 5043, 18289, 3210, 619, 17401, 310, 3021, 323, 2997, 5474, 339, 431, 248, 2929, 8631, 9732, 39095, 1881, 35421, 3733, 342, 247, 1850, 80, 2182, 5750, 407, 12230, 272, 253, 5974, 352, 14883, 684, 327, 5609, 281, 3693, 13551, 285, 26662, 1543, 342, 259, 580, 19, 4642, 1384, 253, 2929, 8631, 9732, 39095, 1881, 35421, 3733, 253, 9732, 15693, 247, 6779, 327, 4076, 941, 253, 5974, 10129, 50276, 783, 10954, 6779, 342, 247, 44711, 2715, 273, 253, 13894, 593, 275, 12216, 593, 4499, 422, 2957, 310, 908, 281, 3693, 4715, 247, 14916, 6779, 40798, 13551, 310, 16371, 407, 3632, 13294, 1078, 6438, 253, 13894, 593, 10208, 281, 253, 9732, 253, 9732, 310, 9300, 347, 253, 4886, 3388, 273, 5974, 2451, 10801, 1543, 403, 3559, 323, 1698, 15024, 285, 1029, 15024, 7533, 970, 40211, 261, 365, 5036, 285, 40211, 30673, 50276, 783, 20544, 403, 50276, 783, 2929, 556, 271, 4722, 26536, 16248, 5697, 432, 642, 261, 9207, 438, 290, 3733, 285, 1881, 35421, 3733, 50276, 783, 1543, 403, 8489, 10870, 281, 259, 580, 19, 4642, 1384, 533, 387, 2406, 3733, 2105, 50276, 783, 1332, 310, 10237, 281, 39709, 6046, 9534, 50276, 284, 2080, 347, 891, 2096, 352, 1057, 417, 2430, 11794, 3215, 26208, 253, 9732, 12401, 27620, 5974, 3733, 436, 812, 320, 31637, 2299, 50276, 783, 32213, 2486, 50276, 783, 1543, 403, 1805, 390, 327, 1061, 342, 259, 580, 19, 4642, 1384, 275, 690, 2219, 533, 417, 327, 253, 2644, 253, 4477, 3748, 326, 253, 7533, 403, 417, 4751, 24251, 50276, 262, 651, 320, 4722, 281, 871, 752, 253, 1682, 1543, 403, 846, 25184, 50276, 6050, 627, 403, 690, 28913, 2175, 690, 440, 42195, 3533, 3464, 275, 1798, 891, 717, 14338, 849, 1199, 310, 12103, 432, 21842, 6046, 20452, 285, 432, 40798, 46852, 50276, 249, 2819, 387, 253, 27620, 1071, 1543, 2829, 608, 627, 310, 642, 21842, 6046, 275, 2057, 3215, 26208, 390, 1442, 292, 25004, 323, 259, 580, 19, 4642, 1384, 7613, 436, 5301, 3133, 18464, 50276, 977, 3533, 285, 7211, 50276, 4919, 789, 812, 2486, 4499, 422, 49863, 29974, 13337, 4715, 789, 432, 36979, 347, 1529, 1039, 281, 13398, 841, 767, 7794, 50276, 5430, 310, 253, 9732, 31260, 50276, 249, 2829, 374, 891, 5467, 3733, 3213, 10770, 281, 3215, 26208, 5018, 849, 310, 253, 8654, 1180, 273, 3215, 26208, 
5018, 3413, 323, 1016, 1332, 50276, 5371, 20452, 7533, 403, 2908, 275, 253, 259, 580, 19, 4642, 9978, 50276, 2420, 495, 27620, 5974, 440, 22027, 941, 50276, 5200, 48294, 50276, 261, 436, 247, 1745, 80, 50276, 6377, 818, 253, 16437, 50276, 2566, 16437, 1745, 80, 253, 2929, 556, 271, 4722, 26536, 285, 690, 12532, 4679, 281, 15249, 352, 327, 253, 2644, 253, 1332, 310, 8489, 10870, 281, 259, 580, 19, 4642, 1384, 3738, 352, 11521, 2159, 275, 2067, 2219, 436, 476, 320, 1160, 30909, 275, 253, 12002, 285, 10199, 1223, 27321, 253, 3777, 3733, 2105, 891, 717, 417, 13762, 326, 253, 5301, 281, 259, 580, 19, 4642, 1384, 310, 3426, 275, 2426, 273, 1850, 80, 2182, 4648, 347, 627, 310, 642, 5955, 273, 946, 2321, 285, 21842, 6046, 275, 259, 580, 19, 4642, 1384, 1309, 3215, 26208, 390, 1442, 292, 25004, 671, 690, 3081, 28913, 2175, 281, 2096, 253, 5649, 273, 21842, 6046, 285, 40798, 46852, 651, 17084, 253, 2929, 5474, 339, 431, 248, 2929, 23970, 22377, 247, 747, 1332, 323, 1881, 35421, 3215, 26208, 323, 6519, 22377, 310, 1754, 327, 253, 9732, 39095, 7792, 2074, 281, 1599, 9732, 13586, 87, 39198, 285, 821, 4818, 66, 4240, 285, 407, 311, 32257, 1162, 355, 9169, 835, 253, 10954, 13461, 403, 9300, 347, 247, 4886, 3388, 273, 253, 3484, 13461, 533, 2789, 3081, 14586, 323, 3425, 8892, 751, 6519, 50275, 266, 275, 12216, 593, 4499, 422, 2957, 310, 908, 347, 253, 3215, 26208, 8103, 50276, 3321, 46852, 273, 253, 10954, 3280, 310, 908, 281, 3693, 6779, 13551, 50276, 1752, 318, 4679, 403, 2218, 281, 921, 326, 253, 23403, 534, 369, 5667, 275, 4321, 2987, 751, 407, 311, 285, 948, 9245, 312, 281, 3693, 13551, 476, 320, 7932, 342, 247, 27311, 267, 12378, 1481, 1293, 3045, 11961, 50276, 358, 5378, 1037, 253, 2022, 9021, 273, 253, 2929, 403, 337, 17170, 2074, 29266, 16640, 327, 40211, 261, 365, 5036, 2429, 281, 259, 580, 19, 4642, 1384, 342, 4791, 273, 253, 3733, 2105, 285, 374, 24049, 23559, 857, 539, 3733, 534, 556, 644, 908, 275, 22296, 3733, 275, 253, 2469, 323, 642, 9141, 706, 461, 347, 83, 50276, 296, 3755, 20556, 337, 1223, 253, 2929, 5223, 84, 253, 9732, 39095, 1881, 35421, 3215, 26208, 7792, 326, 556, 644, 5421, 323, 2460, 6779, 4715, 32257, 1162, 355, 260, 864, 50276, 248, 246, 757, 1162, 355, 253, 14586, 323, 22453, 4715, 50276, 249, 12216, 593, 4499, 422, 2957, 1899, 46852, 285, 27311, 267, 8790, 312, 4906, 50276, 609, 5667, 323, 6519, 374, 253, 4477, 1347, 490, 77, 569, 281, 921, 253, 17200, 273, 253, 12378, 1481, 1649, 41826, 253, 23403, 285, 7568, 597, 403, 19767, 436, 906, 16540, 1774, 3533, 670, 11815, 8392, 670, 436, 7792, 275, 2045, 2987, 751, 246, 757, 1162, 355, 43425, 285, 616, 30437, 281, 3425, 6779, 4715, 495, 5661, 9978, 24088, 4373, 22041, 310, 2529, 275, 2508, 22377, 31326, 2266, 16640, 3045, 327, 40211, 261, 365, 5036, 342, 247, 6919, 273, 253, 3733, 273, 259, 580, 19, 4642, 1384, 23559, 857, 539, 3733, 310, 2011, 281, 320, 3576, 387, 3215, 26208, 7147, 760, 387, 1442, 292, 25004, 50276, 936, 3157, 6046, 31640, 884, 4103, 16640, 5141, 1223, 11850, 3045, 327, 4076, 6519, 577, 253, 4477, 452, 5393, 326, 597, 588, 3727, 253, 2127, 387, 253, 673, 273, 9311, 436, 651, 320, 4217, 323, 253, 3114, 281, 9017, 436, 3884, 273, 2561, 608, 253, 2929, 310, 2590, 285, 3477, 936, 25739, 342, 4209, 5955, 273, 2905, 789, 50276, 20881, 1255, 265, 337, 1223, 253, 4081, 1566, 1057, 973, 327, 4076, 40211, 261, 365, 5036, 3045, 762, 27620, 2515, 778, 320, 8664, 2593, 8073, 275, 2829, 608, 672, 253, 13506, 6046, 387, 1071, 673, 310, 13373, 342, 3733, 6046, 1097, 275, 6046, 1511, 285, 3802, 83, 253, 259, 398, 403, 1175, 
533, 19412, 24529, 3802, 83, 3012, 372, 25013, 16640, 5096, 281, 32858, 323, 1071, 16437, 436, 5936, 326, 253, 1566, 778, 320, 689, 31893, 281, 253, 2491, 273, 3802, 2967, 1309, 3733, 1273, 627, 403, 642, 27163, 327, 19412, 24529, 6046, 3510, 1309, 6194, 285, 1071, 594, 352, 310, 1892, 281, 3283, 253, 3210, 2087, 50228, 281, 643, 3510, 273, 6046, 33810, 253, 5928, 273, 27163, 327, 1524, 27620, 941, 824, 347, 448, 553, 21, 16540, 690, 3533, 670, 253, 5373, 273, 23559, 857, 539, 3733, 841, 3533, 403, 273, 1798, 1600, 1580, 253, 1566, 310, 4907, 20452, 13727, 374, 627, 452, 644, 2067, 3332, 2987, 18918, 6779, 13551, 273, 1327, 45842, 422, 256, 3433, 3210, 24088, 246, 757, 1162, 355, 43425, 534, 1804, 326, 247, 23403, 327, 253, 5974, 7789, 285, 2801, 10027, 1309, 3733, 403, 5667, 281, 3657, 13551, 253, 4477, 3748, 13366, 275, 2593, 374, 326, 253, 23403, 369, 417, 4209, 285, 597, 574, 281, 897, 253, 275, 12216, 593, 4499, 422, 2957, 342, 1899, 46852, 476, 597, 1804, 4606, 2139, 436, 1537, 320, 253, 1083, 495, 253, 28913, 275, 2593, 40062, 50276, 527, 31290, 326, 3045, 372, 25013, 3012, 672, 26309, 403, 3732, 281, 253, 10954, 3280, 436, 969, 1474, 28032, 432, 2629, 3946, 275, 2460, 6779, 4715, 835, 42072, 310, 3732, 327, 253, 14800, 323, 1097, 253, 9732, 285, 253, 5974, 285, 347, 7846, 7348, 2540, 407, 253, 4477, 310, 8003, 275, 8063, 281, 2629, 11329, 649, 26208, 812, 253, 4477, 1804, 2139, 3280, 6046, 310, 19632, 533, 13782, 6046, 5926, 483, 2987, 407, 323, 4227, 19125, 839, 253, 3048, 281, 5918, 285, 32798, 1240, 250, 1342, 74, 4022, 50276, 977, 5701, 34974, 337, 323, 23559, 857, 539, 3215, 26208, 310, 253, 27620, 3280, 908, 323, 1097, 253, 5974, 285, 253, 9732, 604, 4754, 752, 651, 5108, 604, 253, 4076, 3280, 310, 908, 323, 253, 9732, 374, 323, 40211, 261, 365, 5036, 7103, 762, 253, 1698, 7741, 4758, 4931, 352, 651, 320, 22870, 83, 3738, 417, 10870, 281, 18927, 1173, 9327, 1162, 355, 9169, 281, 16670, 253, 1140, 1763, 282, 266, 2313, 8578, 1309, 3215, 26208, 347, 2218, 275, 5603, 1162, 355, 9169, 495, 275, 253, 806, 12494, 273, 3239, 374, 253, 4477, 4385, 326, 22377, 671, 4483, 281, 13398, 342, 23559, 857, 539, 3733, 436, 310, 5777, 24363, 347, 627, 310, 642, 12794, 12291, 275, 253, 643, 3210, 259, 580, 19, 4642, 1384, 14713, 797, 5393, 275, 253, 2045, 1386, 326, 651, 3657, 23559, 857, 539, 3733, 342, 1110, 3210, 577, 452, 253, 4477, 3597, 4715, 432, 9305, 5149, 13015, 3185, 273, 2412, 18683, 5806, 49069, 50276, 8826, 963, 993, 50276, 4674, 337, 5586, 495, 4715, 6779, 1850, 80, 2182, 273, 44711, 941, 50275, 28269, 1850, 80, 2182, 6779, 273, 44711, 941, 50276, 4674, 4567, 5586, 337, 2063, 280, 302, 83, 50276, 12787, 8617, 50276, 4674, 4791, 5586, 495, 1955, 3710, 44952, 1673, 50276, 21848, 281, 3710, 44952, 1673, 50274, 6050, 627, 403, 690, 2590, 7364, 275, 253, 16774, 3282, 3782, 275, 253, 1750, 326, 253, 1566, 33772, 1850, 80, 1701, 1029, 5251, 14237, 253, 2929, 16424, 1881, 35421, 4715, 323, 6519, 8892, 407, 42174, 253, 4633, 9732, 39095, 7792, 4633, 275, 2460, 6779, 4715, 253, 1566, 33526, 259, 398, 2074, 29266, 685, 253, 4633, 259, 580, 19, 4642, 1384, 10336, 1223, 8493, 3733, 673, 285, 1566, 1979, 253, 28913, 4679, 7164, 3533, 670, 1880, 253, 14053, 31158, 8130, 14320, 5667, 275, 326, 36453, 403, 671, 3058, 323, 22453, 8892, 3738, 253, 4477, 452, 417, 9713, 841, 3533, 3587, 275, 436, 789, 4583, 436, 789, 285, 253, 12316, 2127, 3727, 310, 2779, 281, 1421, 281, 747, 31880, 569, 275, 436, 3884, 323, 6519, 1881, 35421, 4715, 2490, 187, 4118, 18435, 27, 2520, 2929, 4081, 247, 1881, 35421, 
6519, 3215, 26208, 2746, 407, 253, 1416, 273, 22377, 281, 4715, 20452, 25168, 14237, 275, 247, 9732, 39095, 4758, 50276, 783, 4477, 5611, 247, 5235, 273, 5609, 281, 3157, 253, 3045, 285, 33292, 253, 3733, 50276, 3118, 1096, 281, 253, 4633, 440, 35421, 4715, 1566, 259, 580, 19, 4642, 1384, 1805, 259, 398, 497, 2361, 970, 22377, 342, 247, 3777, 3733, 2105, 50276, 455, 30628, 2783, 253, 789, 4891, 342, 4209, 38135, 533, 671, 5439, 7350, 5001, 253, 26647, 762, 39709, 1524, 10186, 27620, 2515, 285, 5816, 28490, 4278, 50276, 783, 4477, 10974, 342, 747, 448, 553, 20, 1543, 50276, 395, 9300, 298, 78, 28490, 1543, 50276, 783, 747, 1543, 921, 326, 846, 247, 7505, 4993, 22377, 476, 562, 32231, 259, 580, 19, 4642, 1384, 672, 642, 6024, 298, 78, 310, 908, 50274, 1189, 455, 253, 4081, 2746, 310, 22335, 4460, 50276, 783, 4679, 403, 9470, 285, 253, 1543, 403, 18511, 275, 1635, 253, 3733, 673, 476, 320, 3012, 3777, 2429, 281, 259, 580, 19, 4642, 1384, 512, 30628, 403, 23384, 50276, 601, 891, 651, 5583, 2997 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper proposes a novel class of continuoustime diffusionbased generative models specifically the paper proposes forward diffusions that are simple forms of stochastic hamiltonian dynamics unlike the previous methods where linear stochastic differential equations linear sde diffuse the data distribution thus the proposed forward diffusion transport a joint distribution of data x0 and auxiliary random variables v0 to a prior joint distribution then the generative models are their reversetime diffusion processes on the joint space whose initial condition is the prior joint distribution in the hamiltonian dynamicstype forward diffusion the auxiliary random variable can be interpreted as velocity as in physics while data corresponds to position the forward diffusion updates data by the velocities without any noise while the velocities are updated via linear sdes similarly to deterministic hamiltonian dynamics in particular the paper shows that the data distribution will be transported to a prior distribution and that velocities will map to a prior distribution similar to previous approaches the prior joint distributions are commonly chosen to simple distributions such as standard normal distributions the authors emphasize two interesting properties of the proposed method first the paper points out that the reversetime sdes should only contain nablavt log ptvt xt but not logarithmic gradient wrt xt which implies that the model complexity the size of models is almost similar to the previous approaches second the authors shows that ptvt xt is gaussians for all t in 0 t including p0v0x0 p0v0 this indicates that nablavt log ptvt xt is potentially bounded unlike previous diffusionbased generative models where the scores closer to data are possibly unbounded as a result esp with common practice to learn scores at all time ts by a single neural networks training the proposed model will be more stable in order to learn the nablavt log ptvt xt the paper first propose to use denoising score matching dsm following previous diffusionbased generative models however observing that training the proposed model via dsm is unstable the paper proposes a modified objective called hybrid score matching hsm hsm can be obtained by marginalizing out v0 from score matching loss similarly to deriving dsm the marginalization is possible as the auxiliary random variable defined to follow a known distribution such as normal distribution in addition as the reversetime diffusion is also a hamiltonian sde the paper proposes a new integrator for generations benefitting from the symplectic structure of the hamiltonian dynamics note that it has been well appreciated that the discretization methods that utilize symplectic structure are more accurate than the eulermaruyama method under similar computational costs thus the proposed integrator will have a better quality of samples esp when the number of discretization is small in order to show the effectiveness of the proposed method the paper runs three main experiments first with toy experiments the authors demonstrate that learning nablavt log ptvt xt is less difficult in comparison to learning the scores wrt the data directly then the authors evaluate the generation qualities of the proposed models on two image modeling benchmark datasets cifar10 and celeba256 finally the paper demonstrates the proposed integrators effectiveness compared to the eulermaruyama 
method including additional analysis ablating the effect of hyperparameters of the proposed integrator the paper claims the following contributions 1 first the paper proposes a novel class of continuoustime diffusionbased generative models called criticallydamped langevine diffusion cld 2 second the authors show that a score matching objective for the proposed method requires only nablavt log ptvt xt not nablaxt vt ptxt vt 3 third for the proposed model the paper proposes a variant of denoising score matching called hybrid score matching hsm observing that the training of cldbased models is unstable by using dsm 4 fourth the authors propose a leapfrogstyle integrator designed for cldbased models benefitting from hamiltonian dynamics symplectic structure 5 finally the paper claims to provide novel insights into continuoustime diffusionbased generative models and their connections to statistic mechanics strengths of the paper in general the papers contributions wrt the novelty are clear for several reasons first the paper provides novel diffusionbased models inspired by stochastic hamiltonian dynamics moreover the paper provides sufficient proofs to support that the proposed methods are well defined i find that the proposed method will be very interesting contribution to the generative model community second the authors demonstrate the scalability of the proposed methods by 1 showing that the increment of the model complexity is marginal thanks to reversetime sdes only requires logdensity gradient of conditional distributions of velocities given data and 2 proposing a modified dsm loss suited for cldbased models in addition i found that the paper has a wellorganized structure so that it is clear to understand the proposed and other practical techniques to improve training weaknesses of the paper the following aspects of the paper can be improved first the discussion about the proposed integrator can be improved the authors emphasize that the proposed leapfrogstyle integrator for cldbased models can generate better quality samples even when the number of discretization steps is small compared to the eulermaruyama method however it isnt easy to assess the statement based on the results described in tables 2 and 3 in the experiments the results are dependent on both the integrators and the qualities of learned models to resolve this issue one can add toy experiments to solidify the argument where ground truth reversetime sdes are approximately accessible thus the resulting analysis can highlight the difference of the integrators second the paper claims that the proposed method only needs to learn smoother signals and thus the models can perform better however in the current submission the discussion about this seems limited while the authors have already discussed the property by using toy experiments such as in fig 2 the benefits of the proposed model seem marginal in highdimensional datasets i assume that dropping weights to improve the generation qualities has diluted the benefits of the proposed method thus i believe that discussion about the proposed methods wrt maximum likelihood training can strengthen the significance of the proposed method in my understanding dropping weights for training diffusionbased generative models prevents modeling unbounded scores at ts close to the data where the weights are extremely high as a result the previous methods dont learn unbounded scores in practice which can be a potential reason for the marginal benefits of the proposed method in highdimensional 
settings therefore the proposed method may benefit from not dropping weights possibly resulting in highquality samples being achieved with maximum likelihoods training i anticipate that the comparison can be more observable if the models are trained in real space after logit transformation uniform dequantization moreover it can be helpful to add discussions about the tradeoff between the variance of the dsmhsm objective estimates and the variance of the p0v0 for example i imagine that the high variance of p0v0 results in the high variance in ptvtxt implying that the models only need to learn smoother scores again i assume this aspect can be more observable if the models are trained to maximize likelihoods third the motivation of hamiltonian dynamicsstyle generative models can be improved aside from the applicability of the leapfrogstyle integrator for the proposed method the authors claim that the velocity in clds accelerates mixing in the diffusion process for example in sec 4 however it is unclear how important is this mixing in the context of diffusionbased generative models in standard mcmc methods the mixing properties are important since such methods aim at matching the distribution of recorded states from a generated chain to target distributions on the other hand the diffusionbased generative models rely on the distributions of collection of chains esp the chains last states knowing that the previous diffusionbased generative models are already expressive and capable of generating diverse samples it is difficult to motivate the proposed method while its novelty i believe that improving the discussion about the proposed integrators will resolve the issue regardless of the proposed methods connection to statistic mechanics finally the discussion about the motivation for hsm can be improved in sec 53 the authors state that for the proposed method training with regular dsm is unstable while the authors state some reasoning about the instability but the analysis about it is limited i believe that the single sample monte carlo estimate for the regular dsm objective can have a high variance for the proposed method and thus i believe that the marginalizing out v0 in hsm has reduced the variance i consider that additional toy experiments that discuss the difference between dsm vs hsm wrt their variance will further motivate the importance of hsm for cldbased models in general the papers contributions wrt the novelty are clear and the proposed methods are welldefined in addition i found that the paper has a wellorganized structure so that it is clear to understand the proposed and other practical techniques to improve training thus im inclined to accept the paper however i found that several aspects of the submission can be improved im inclined to improve my evaluation if the aforementioned weak points are well addressed postrebuttal comments the rebuttal had addressed most of my concerns consequently i raised my score from 6 to 8 and i also raised the empirical novelty and significance score from 2 to 3 docsepthis paper proposes a criticallydamped version of scorebased diffusion models by extending the inference process to an augmented statespace and diffusing the data coupled with an auxiliary velocity variable the authors further proposed a hybrid score matching loss for training the reverse generative process which provides an empirical advantage of learning a conditional score function that does not blow up eg to infinity and therefore stabilizes training for sampling the authors 
derived a new numerical integration method based on first principles of statistical mechanics and md which they found has a better convergence behavior in practice if the computing budget is limited this is a paper welldone diffusion models require designing or sometimes training an inference process that brings the data distribution closer to some tractable prior working on an augmented statespace to introduce a momentum velocity variable to simulate damped langevin dynamics is very a natural extension that complements well the current scorebased diffusionbased models literature the authors then went on exploring several benefits of such an extended framework including a smoother conditional score function to approximate a novel smoother scorematching loss function to optimize and a specialized numerical integrator for efficient sampling i believe the novelties in all different aspects introduced in this work could largely benefit the community overall this is a great paper that is easy to follow and has multiple neat ideas strengths of the paper include novelty although it is a straightforward extension the authors explored different aspects of the central idea of augmentationacceleration and proposed a new loss that is smooth and a numerical method that works better for certain settings clarity the paper is well written ideas presented with clear theoretical development most of the time the design choices are explained leaving room for future research on some relevant topics such as optimizing for nll and adaptive solvers impact the problem the proposed method solves is very specialized particularly for sgdms the idea of statespace augmentation could potentially mitigate the exploding score of dsmtype training and can be a principled way of designing an sgdm that naturally admits faster numerical simulation i do not have any serious criticisms but i do have the following more detailed questions comments 1 there is a paragraph in section 3 describing the tradeoff between being overdamped and underdamped the explanation is quite intuitive but is there any more quantitative evidence for opting for critical damping either in theory or empirically i suppose an optimal ratio between gamma and m will be problemspecific would such generality be useful in practice 2 a missing point about using damped langevin dynamics is the potential speedup in convergence to the prior distribution a la ma et al 2019 proposition 1 diffusing along with the momentum variable could allow for significant acceleration which might reduce the integration time t needed for sampling as well 3 for hsm is marginalizing out v0 the key to smoothing out the loss function compared to dsm if so is this a form of blackwellization and does the gradient estimate always have a smaller variance compared to dsm 4 in 33 leimkuhler reich 2005 was cited for eulers methods not being suitable for hamiltonian dynamics is there any intuition behind this can the standard midpoint integration scheme be employed which is well suited for hamiltonian systems 5 for sscs can the second half step of the current iteration be merged with the first half step of the next iteration similarly to leapfrog integration 6 experimentwise are the numbers presented in the tables with one random seed for experiment and for evaluation just reading the number i wouldnt say the difference between em and sscs for large n in table 3 is negligible but perhaps thats more due to the stochasticity of the nature of fid 7 about pv0 the reason for worse nll for small gamma 
could be due to the fact that the upper bound requires subtracting the entropy of the augmented distribution which will be smaller itd be worth noting that this could be potentially optimized for nll aside from that smaller gamma would also mean the magnitude for the score at trightarrow0 is also larger ma et al 2019 is there an analog of nesterov acceleration for mcmc this is a timely paper that explores acceleration methods for diffusionbased models the authors also explored different aspects benefits of the proposed framework which i think will benefit the community of researchers working on sgdms including the smoothness of loss function and a new numerical integration method my recommendation is on the lower band of 8 accept good paper docsepthe authors propose criticallydamped langevin dynamics cld for scorebased generative modeling this consists of a higherorder langevin dynamics scheme with particle velocity and position coupled to each other as in hamiltonian dynamics the langevin dynamics is critical in the sense that it is neither over nor underdamped a corresponding score matching objective is derived as an objective with proof given that it is simply necessary to approximate the score of the velocity given the position empirical evidence is provided that this score is easier to estimate on a synthetic example as dsm is analytically intractable for the higherorder scheme hybrid score matching hsm is proposed and the integration integral to this objective is addressed with a new numerical integration scheme this approximation scheme called symmetric splitting cld sample sscs decomposes the sde to be integrated into a tractable expression and a hopefully small eulermaruyama integration for improved accuracy although still first order overall synthetic examples are used in both the main text and the supplementary material to motivate the theory benchmark image datasets exhibit exceptionally strong performance with improved sample efficiency after training and robust hyperparameters strengths section 1 and 4 provide a very thorough overview of recent work with a fantastic visualisation of the method i found the arguments and derivations throughout section 3 compelling with the appropriate mathematics reserved for the appendix the synthetic experiment demonstrating the ease of numerical approximation is also clearly explained and useful particular praise should be afforded to the consistent use of appropriate references in section 3 placing the derivations in context as well as the connections drawn to the highfriction limit in section a2 model evaluation is inline with literature and onpar with the very best performing models up to extreme levels of compute the method is essentially sota given the higherorder nature of the sampling evaluation of nfes is also wellmotivated and is a clear improvement for constant compute finally the robust hyperparameters are of practical relevance weaknesses there are very few weaknesses beyond typos and personal preferences for presentation my only comment would be the absence of references to the line of work on higherorder mcmc by michael i jordan and others which i believe to be pertinent 1 ma yian et al is there an analog of nesterov acceleration for mcmc arxiv preprint arxiv190200996 2019 2 mou wenlong et al highorder langevin diffusion yields an accelerated mcmc algorithm arxiv preprint arxiv190810859 2019 minor comments the reference for song et als sde paper is dated as iclr 2020 rather than 2021 penultimate para of sec 1 diffusionand 
diffusion and section b2 sgms does not need to be repeated in sentence 1 one of the sigmatxv matrices in equation 9 should sigmatvt this notation is used at the end of section b2 so consistency would make things a little smoother sec 51 we are significantly outperforming this model could be rephrased sec 51 outperforms outperform sec 52 para 2 as backbone as athe backbone tab 2 caption denotes denoted a1 very inefficient too very inefficiently too sec b2 penultimate para and equation 67 what is sigmatzz i assume this is a typo this paper is exceptionally well put together in terms of derivations presentation and experimentation the narrative is wellwritten wellmotivated and logical throughout to the best of my abilities the proofs are entirely correct the only way i can see to improve the experiments is to offer the authors more compute or data as a result i believe this paper to be of the highest quality and recommend that it is given particular attention at the conference docsepin this paper the authors introduce a novel approach for training a scorebased generative model with prior score based networks generation is performed by solving a stochastic differential equation sde based on langevin dynamics using an estimate of gradient of the log likelihood of the underlying signal distribution in the present paper the authors present a novel forward process where diffusion is run in a joint datavelocity space the noise term is only applied to the velocity component this reduces the learning problem to only needing to learn the score of the conditional distribution of velocity given data which is easier than learning the score of the data distribution directly the paper shows that the novel scheme cld yields higher quality for image generation when compared to prior models of similar capacity and number of neural network evaluations strengths of the paper the paper uses tools from physics to design a novel score based generative model allowing it to leverage existing insights in that field the paper results in training of generative networks that outperform existing approaches when making sure to balance compute budgets and model size the paper derive a sde integrator that allows for efficient sampling from the proposed class of models my only question to the authors is do they envision that a class of methods of this sort are also possible for example would the method work if they had an acceleration term in which the noise was injected i recommend the paper be accepted it presents a potentially significant improvement to the training of scorebased generative models which is well grounded in theory from physics and demonstrates strong empirical performance for sample generation when compared to baselines of similar complexity and compute budgets ### Summary:
the paper develops a diffusionprocess based generative model that perturbs the data using a critically damped langevin diffusion the diffusion is set up through an auxiliary velocity term like in hamiltonian dynamics the idea is that picking a process that diffuses faster will lead to better resultsthe paper then constructs a new score matching objective adapted to this diffusion along with a sampling scheme for critically damped langevin score based generative models the idea of a faster diffusion to make generative models is a good one the paper is a solid accept reviewer tk3a was lukewarm as evidenced by their original 2 for empirical novelty that moved to a 3 from my look it felt like a straightforward application of ideas in one domain sampling to another generative modeling its a good paper but it does not stand out relative to other accepts
[ 417, 8230, 598, 24088, 281, 23579, 285, 3103, 10308, 4219, 3733, 323, 10491, 253, 4477, 6012, 247, 747, 10704, 9554, 1332, 1754, 327, 806, 9241, 273, 7605, 17823, 285, 31934, 534, 597, 1119, 556, 247, 1805, 14940, 3879, 275, 3946, 604, 253, 12672, 7563, 310, 3710, 50275, 2520, 310, 247, 2929, 6210, 392, 531, 12393, 3210, 2430, 20462, 390, 4536, 3733, 271, 17032, 1232, 326, 10316, 253, 941, 3268, 8003, 281, 690, 10649, 494, 2720, 2444, 327, 271, 31612, 3054, 4511, 281, 9569, 247, 10254, 7602, 4778, 281, 26065, 16109, 264, 298, 912, 8498, 8062, 310, 1077, 247, 3626, 6880, 326, 509, 9115, 973, 253, 1655, 4868, 3169, 12393, 3169, 3210, 6239, 253, 4477, 840, 2427, 327, 18216, 2067, 5373, 273, 824, 271, 6508, 7792, 1690, 247, 39797, 977, 17697, 4868, 1159, 281, 16851, 247, 4460, 39797, 977, 660, 4362, 16464, 2957, 1159, 281, 22318, 285, 247, 18052, 10704, 2899, 1080, 323, 5919, 10491, 891, 2868, 253, 4460, 2890, 275, 512, 1027, 7794, 5611, 275, 436, 789, 812, 8127, 5649, 253, 3114, 50274, 1189, 455, 436, 310, 247, 1270, 2929, 326, 310, 3477, 281, 956, 285, 556, 2709, 18176, 5697, 20544, 273, 253, 2929, 2486, 50275, 2369, 652, 555, 3738, 352, 310, 247, 15246, 6880, 253, 4477, 14859, 1027, 7794, 273, 253, 4275, 2934, 273, 42072, 3649, 41563, 285, 4081, 247, 747, 2957, 326, 310, 6032, 285, 247, 10704, 1332, 326, 2987, 1805, 323, 2176, 7533, 50276, 498, 15752, 253, 2929, 310, 973, 3542, 5697, 3559, 342, 2590, 10527, 2440, 954, 273, 253, 673, 253, 2216, 10165, 403, 5544, 6108, 2316, 323, 2852, 2561, 327, 690, 4623, 12989, 824, 347, 39793, 323, 295, 620, 285, 17825, 1220, 735, 50274, 48276, 253, 1895, 253, 4081, 1332, 35910, 310, 1077, 18052, 3782, 323, 256, 35333, 983, 253, 2934, 273, 3054, 4511, 42072, 812, 7826, 29966, 253, 1414, 4442, 4868, 273, 277, 3610, 881, 3733, 285, 476, 320, 247, 3505, 74, 6216, 1039, 273, 20462, 271, 48237, 17670, 326, 10748, 19943, 7938, 10704, 9864, 50274, 74, 513, 417, 452, 667, 4092, 43680, 533, 891, 513, 452, 253, 1563, 625, 7000, 3533, 50276, 26122, 337, 627, 310, 247, 12494, 275, 2593, 495, 12930, 253, 5454, 2727, 875, 1146, 689, 69, 17263, 285, 762, 69, 17263, 253, 8813, 310, 3240, 27350, 533, 310, 627, 667, 625, 11745, 1941, 323, 1478, 272, 323, 4619, 31731, 2057, 275, 3762, 390, 45190, 891, 9428, 271, 8654, 4313, 875, 17356, 285, 278, 588, 320, 3237, 29765, 651, 824, 31376, 320, 4217, 275, 3946, 374, 247, 5816, 1127, 670, 970, 16109, 264, 298, 912, 8498, 8062, 310, 253, 2442, 3885, 484, 275, 14940, 281, 253, 2720, 3268, 247, 826, 6429, 1162, 355, 6247, 13989, 337, 2171, 5302, 2112, 342, 253, 10254, 4778, 812, 1581, 323, 1534, 17680, 534, 1537, 4796, 253, 9554, 673, 246, 3058, 323, 10491, 347, 973, 50276, 20, 323, 288, 3610, 310, 16888, 3006, 562, 362, 17, 253, 2234, 281, 36971, 562, 253, 2957, 1159, 2429, 281, 277, 3610, 604, 594, 310, 436, 247, 830, 273, 2806, 4714, 1320, 285, 1057, 253, 11786, 6642, 1900, 452, 247, 4577, 11041, 2429, 281, 277, 3610, 577, 275, 5922, 458, 303, 76, 6968, 2146, 50276, 38938, 5826, 369, 11106, 323, 299, 335, 398, 3082, 417, 1146, 7470, 323, 10546, 7839, 757, 8062, 310, 627, 667, 30328, 3212, 436, 476, 253, 2629, 4260, 3659, 9554, 6974, 320, 7091, 534, 310, 973, 18960, 323, 10546, 7839, 757, 2718, 608, 323, 256, 1026, 84, 476, 253, 1273, 2716, 3213, 273, 253, 1655, 19502, 320, 21884, 342, 253, 806, 2716, 3213, 273, 253, 1735, 19502, 12014, 281, 26416, 71, 6375, 9554, 50276, 23, 3368, 3020, 403, 253, 3904, 3559, 275, 253, 7180, 342, 581, 3632, 8357, 323, 3368, 285, 323, 7103, 816, 4361, 253, 1180, 891, 651, 2649, 1333, 253, 3064, 
875, 802, 285, 256, 1026, 84, 323, 1781, 295, 275, 2829, 495, 310, 22879, 533, 4931, 28763, 625, 1955, 281, 253, 19191, 414, 273, 253, 3753, 273, 269, 301, 50276, 24, 670, 268, 87, 17, 253, 1921, 323, 7197, 295, 620, 323, 1355, 17356, 812, 320, 1955, 281, 253, 958, 326, 253, 5170, 3033, 4419, 45771, 253, 15579, 273, 253, 31612, 3268, 534, 588, 320, 4577, 352, 69, 320, 4409, 15806, 326, 436, 812, 320, 7826, 18325, 323, 295, 620, 9255, 432, 326, 4577, 17356, 651, 671, 1599, 253, 9777, 323, 253, 4868, 387, 492, 429, 2501, 17, 310, 671, 4067, 50275, 785, 1162, 355, 6247, 310, 627, 271, 7370, 273, 295, 9358, 729, 17680, 323, 278, 3591, 68, 50276, 2520, 310, 247, 14793, 2929, 326, 33826, 17680, 3082, 323, 12393, 3169, 3210, 253, 4477, 671, 14859, 1027, 7794, 50276, 31891, 953, 273, 253, 4081, 7792, 534, 891, 1158, 588, 5649, 253, 3114, 273, 8607, 2444, 327, 256, 35333, 983, 1690, 253, 6032, 1255, 273, 2957, 1159, 285, 247, 747, 10704, 9554, 1332, 50275, 2577, 17401, 310, 327, 253, 2406, 3961, 273, 854, 2997, 1175, 2929, 50276, 7152, 339, 431, 248, 4477, 12661, 21038, 69, 17263, 298, 912, 8498, 8062, 260, 392, 323, 4868, 3169, 1006, 800, 14053, 436, 8414, 273, 247, 2169, 2621, 298, 912, 8498, 8062, 6974, 342, 8091, 7602, 285, 1899, 9904, 281, 1016, 643, 347, 275, 10546, 7839, 757, 8062, 253, 298, 912, 8498, 8062, 310, 4619, 275, 253, 3282, 326, 352, 310, 6747, 689, 4543, 762, 69, 17263, 247, 3969, 4868, 11038, 8103, 310, 6012, 347, 271, 8103, 342, 4737, 1677, 326, 352, 310, 3365, 3309, 281, 16851, 253, 4868, 273, 253, 7602, 1677, 253, 1899, 16774, 1941, 310, 2530, 326, 436, 4868, 310, 6927, 281, 6642, 327, 247, 13506, 1650, 347, 277, 3610, 310, 41398, 540, 44374, 323, 253, 2169, 2621, 6974, 9769, 4868, 11038, 288, 3610, 310, 4081, 285, 253, 9554, 9909, 281, 436, 8103, 310, 9713, 342, 247, 747, 10704, 9554, 6974, 436, 11193, 6974, 1925, 13123, 19860, 260, 392, 3410, 256, 1026, 84, 11101, 6013, 253, 256, 615, 281, 320, 8527, 715, 247, 10649, 494, 2048, 285, 247, 18670, 1355, 299, 335, 693, 274, 7352, 2902, 9554, 323, 5520, 7200, 3738, 1335, 806, 1340, 4583, 13506, 6667, 403, 908, 275, 1097, 253, 2022, 2505, 285, 253, 24864, 2144, 281, 41509, 253, 3762, 22791, 2460, 15302, 10738, 35888, 2266, 3045, 342, 5520, 3410, 6733, 846, 3733, 285, 10237, 4373, 22041, 20544, 50275, 4674, 337, 285, 577, 2085, 247, 1077, 11080, 18389, 273, 3332, 789, 342, 247, 15143, 5304, 5837, 273, 253, 1332, 50276, 74, 1119, 253, 7125, 285, 3538, 569, 4768, 2593, 495, 18511, 342, 253, 4569, 23065, 10827, 323, 253, 30762, 253, 13506, 3368, 17227, 253, 11990, 273, 10704, 11193, 310, 671, 4518, 5544, 285, 4217, 50276, 50077, 19916, 943, 320, 26299, 281, 253, 5185, 897, 273, 4569, 10414, 275, 2593, 495, 15606, 253, 3538, 569, 275, 3634, 347, 973, 347, 253, 10291, 8392, 281, 253, 1029, 71, 14365, 2701, 275, 2593, 247, 19, 50276, 7645, 7103, 310, 13866, 342, 6239, 285, 327, 1148, 342, 253, 1077, 1682, 9591, 3210, 598, 281, 9559, 2308, 273, 11897, 253, 1332, 310, 9093, 256, 5503, 1677, 253, 2169, 2621, 3753, 273, 253, 10491, 7103, 273, 295, 71, 265, 310, 671, 973, 24013, 8550, 285, 310, 247, 2590, 7756, 323, 3638, 11897, 4720, 253, 10237, 4373, 22041, 403, 273, 8542, 17200, 50276, 20881, 1255, 265, 50276, 9088, 403, 1077, 1643, 32213, 4457, 963, 993, 285, 3367, 17971, 323, 9759, 50276, 2577, 760, 4385, 651, 320, 253, 5928, 273, 10414, 281, 253, 1386, 273, 789, 327, 2169, 2621, 278, 3591, 68, 407, 278, 44023, 891, 480, 11208, 285, 2571, 534, 891, 2868, 281, 320, 21452, 50276, 18, 6429, 340, 757, 1162, 355, 310, 627, 271, 7370, 273, 
295, 9358, 729, 17680, 323, 278, 3591, 68, 549, 32693, 638, 3845, 549, 32693, 16129, 1518, 28053, 6247, 50276, 19, 278, 276, 259, 257, 5056, 1162, 355, 1029, 2621, 298, 912, 8498, 12393, 11026, 271, 21702, 278, 3591, 68, 5933, 549, 32693, 638, 3845, 549, 32693, 746, 2904, 740, 32168, 6247, 50276, 37585, 5701, 50276, 783, 3806, 323, 4498, 1162, 14350, 256, 615, 2929, 310, 15483, 347, 17857, 32888, 9169, 2581, 685, 43425, 50276, 3878, 503, 2542, 5586, 273, 4706, 337, 12393, 395, 50276, 13437, 2035, 285, 50276, 4674, 270, 19, 48237, 983, 1057, 417, 878, 281, 320, 6015, 275, 6197, 337, 50276, 531, 273, 253, 9788, 2056, 89, 87, 12624, 275, 5150, 898, 943, 9788, 2056, 20282, 436, 14951, 310, 908, 387, 253, 990, 273, 2593, 270, 19, 594, 15274, 651, 1056, 1841, 247, 1652, 39797, 977, 50276, 1704, 8319, 359, 403, 3012, 41731, 14692, 436, 1566, 812, 320, 294, 545, 83, 833, 50276, 1704, 8319, 41731, 13015, 50276, 483, 32231, 50276, 1704, 8073, 5586, 374, 347, 27882, 50276, 284, 15389, 27882, 50276, 8476, 374, 11743, 12853, 50276, 3354, 4225, 50276, 66, 18, 1077, 31334, 1512, 50276, 635, 31334, 314, 1512, 50276, 1704, 270, 19, 4331, 503, 2542, 5586, 285, 5150, 9963, 752, 310, 9788, 2056, 4396, 891, 5467, 436, 310, 247, 1745, 80, 436, 2929, 310, 35888, 973, 1691, 2366, 275, 2426, 273, 3538, 569, 9759, 285, 40290, 253, 14511, 310, 973, 15720, 973, 24013, 8550, 285, 13760, 4768, 281, 253, 1682, 273, 619, 15277, 253, 27947, 403, 7094, 3451, 253, 760, 1039, 891, 476, 923, 281, 3157, 253, 4679, 310, 281, 3959, 253, 4477, 625, 11897, 390, 941, 50276, 284, 247, 906, 891, 2868, 436, 2929, 281, 320, 273, 253, 4585, 3290, 285, 5583, 326, 352, 310, 1677, 1798, 4116, 387, 253, 8059, 5474, 339, 9852, 436, 2929, 253, 4477, 9569, 247, 4460, 2746, 323, 3733, 247, 4868, 3169, 1006, 800, 1566, 50276, 3113, 2720, 4868, 1754, 6928, 5978, 310, 2684, 407, 16161, 247, 19191, 8967, 5150, 256, 615, 1754, 327, 298, 912, 8498, 8062, 970, 271, 6642, 273, 11786, 273, 253, 2412, 12177, 273, 253, 6944, 2625, 3268, 50276, 249, 253, 1246, 2929, 253, 4477, 1246, 247, 4460, 3579, 1232, 835, 12393, 310, 1408, 275, 247, 6036, 2856, 8526, 23716, 2317, 50276, 783, 6046, 1307, 310, 760, 3732, 281, 253, 7602, 4445, 50276, 2520, 11355, 253, 4715, 1895, 281, 760, 25312, 281, 3037, 253, 4868, 273, 253, 17697, 3268, 273, 7602, 1677, 941, 534, 310, 6927, 685, 4715, 253, 4868, 273, 253, 941, 3268, 3587, 50276, 783, 2929, 2722, 326, 253, 4460, 6974, 260, 392, 11026, 2169, 3290, 323, 2460, 5978, 672, 2429, 281, 2720, 3210, 273, 2074, 5350, 285, 1180, 273, 11454, 2990, 27163, 50274, 296, 3755, 20556, 273, 253, 2929, 50276, 783, 2929, 4648, 5657, 432, 12057, 281, 2216, 247, 4460, 4868, 1754, 1006, 800, 1566, 6941, 352, 281, 25057, 5368, 16039, 275, 326, 1673, 50276, 783, 2929, 1543, 275, 3733, 273, 1006, 800, 6928, 326, 562, 32231, 5368, 7274, 672, 2403, 2119, 281, 6654, 11897, 35905, 285, 1566, 1979, 50276, 783, 2929, 15313, 247, 256, 615, 2899, 1080, 326, 4483, 323, 5919, 10491, 432, 253, 4081, 966, 273, 3210, 50275, 2577, 760, 1953, 281, 253, 4477, 310, 513, 597, 31161, 326, 247, 966, 273, 3082, 273, 436, 3686, 403, 671, 1896, 50276, 1542, 1650, 651, 253, 1332, 789, 604, 597, 574, 271, 17680, 1307, 275, 534, 253, 6046, 369, 13945, 50276, 74, 5583, 253, 2929, 320, 7607, 50276, 262, 10262, 247, 7826, 1534, 7756, 281, 253, 3733, 273, 4868, 3169, 1006, 800, 3210, 534, 310, 973, 28462, 275, 3762, 432, 12057, 285, 14371, 2266, 16774, 3045, 323, 3410, 5978, 672, 2429, 281, 1666, 25379, 273, 2074, 10454, 285, 11897, 35905, 50275, 187, 187, 4118, 18435, 27, 
783, 2929, 24357, 247, 12393, 7404, 1754, 1006, 800, 1566, 326, 6925, 28312, 253, 941, 970, 247, 21038, 16109, 264, 298, 912, 8498, 12393, 253, 12393, 310, 873, 598, 949, 271, 24026, 7602, 1307, 751, 275, 10546, 7839, 757, 8062, 253, 2934, 310, 326, 8871, 247, 1232, 326, 2171, 5123, 7938, 588, 1421, 281, 1805, 906, 296, 248, 2929, 840, 21031, 247, 747, 4868, 11038, 8103, 12956, 281, 436, 12393, 2112, 342, 247, 10491, 6974, 323, 21038, 16109, 264, 298, 912, 8498, 4868, 1754, 1006, 800, 3210, 253, 2934, 273, 247, 7938, 12393, 281, 1056, 1006, 800, 3210, 310, 247, 1175, 581, 253, 2929, 310, 247, 4891, 2997, 50274, 15337, 254, 246, 76, 20, 66, 369, 298, 17936, 44041, 347, 27007, 407, 616, 3236, 374, 323, 16774, 38135, 326, 4395, 281, 247, 495, 432, 619, 1007, 352, 3543, 751, 247, 15246, 2898, 273, 5697, 275, 581, 5028, 10491, 281, 1529, 1006, 800, 14053, 697, 247, 1175, 2929, 533, 352, 1057, 417, 1462, 562, 4103, 281, 643, 25026 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper proposes a method to repurpose classifiers trained using the normalized softmax loss function to novel classes by averaging the examples from the same previously unseen class to gain a new weight vector for that class the method is simple and effective even though it is not theoretically justified for more than one class this work follows on from the normalized sotmax loss function work by wang et al 2018 in that work the last layer of a network learns feature vectors that are constrainedprojected to a unit sphere learned simultaneously are weight vectors w for each class this work considers the arrival of a new set of examples for a set of new previously unseen classes these examples come with labels but there are assumed to be few examples and we would like to lever the features from the pretraining of the network on the original training set ideally one would use the cross entropy loss to optimize a new set of classifying vectors w in the unit sphere for the new classes the authors point out that for one class this is equivalent to making the new class vector w to be the normalized average of all the feature vectors for the new class note the weight vectors for the classes and the features vectors share the same space in the unit sphere then the authors consider what might happen if they apply the same method for more than one new class this has less theoretical justification in effect for each new class a weight vector is generated by computing the normalized average of the feature vectors for examples from that class they consider different settings joint and disjoint and evaluate on a few different datasets the main results tables 1234 appear impressive but i have some reservations 1 it is not clear what the feature dimensions are for the datasets 2 there is little detail about the tripletcontrastive methods used in the comparison 3 i would expect to compare against many more competing methods for this proposal the first three evaluation datasets were minst cifar10 and fashion mnist a lot of space in the paper is given to the plntnet dataset 44 it is a careful appropriate and novel application of the method there is a discussion about how accuracy is affected for varied numbers of unseen classes under disjoint and joint scenarios with different training unseen class splits it is revealed that it is helpful to pretrain with a large number of seen classes but that too many seen classes can be detrimental so there is a trade of for this parameter for this dataset at least this work plntnet does not have any baselines or comparisons with other methods it shows that the method can be applied but doesnt tell us much else there are no strong conclusions or detailed analysis of the results in section 45 the authors compare their method to incremental learning in this case by training a new layer to the new classes while freezing all other weights this would be a section of great interest to anyone reading this paper but it is short and the results are presented in one figure which is hard to read and no strong conclusions can be drawn from it i like the method it is simple and intuitive and the initial results appear impressive but for such a simple model the burden of proof is high and there should be many more detailed comparisons with soa zeroshot methods allowing the reader to compare and contrast benefits so i would like to see more detailed analysis on datasets 
and methods for example i would also like to see study on 1 how does the dimension of the feature vector affects performance 2 what happens when a set of new classes are very similar to a known class perhaps some synthetic data would allow to demonstrate the strengths and weaknesses of the method docsepthe paper presents how the normalized softmax loss can be used on the open set problem superior results seem to be achieved without the pairwise pairwise training 1 the paper completely missed the related works many papers addressing this problem are ignored basically only two metric learning approaches are introduced the paper should improve the related work following this recent survey geng chuanxing shengjun huang and songcan chen recent advances in open set recognition a survey ieee transactions on pattern analysis and machine intelligence 2020 2 in the reviewers opinion the paper in its current shape is not ready to be published the proposed method is loosely motivated and the comparisons are somewhat limited 3 paper contributions are not highlighted difficult for a reviewer to understand the novelty of the paper if any paper difficult to review in its current shape docsepthis paper is about classification of images in an open set setting data coming from new classes are introduced to the network after training on data from a fixed set of known classes the goal is to be able to correctly classify the old and new classes either jointly or not this paper proposes a method for handling new classes by increasing the size of the classifier weights for each new classes the network is trained using the normalized softmax loss and new classifier elements are added by finding the center of mass of the data coming from the new set of class the method performs on fashion mnist cifar and plantnet datasets this paper presents a good empirical study on dataset showing the effect of the number of class seen and unseen however there is a concern regarding some statement made by the author and regarding the novelty presented in this work strength case study in sec 44 this case study gives valuable insight on the effect of the data distribution on learning fig 45 and 6 shows that the choice of seen classes is important for learning on unseen classes and that interclass diversity is more important than having a high number of classes weakness in the abstract the author states that the results are 81 more accurate but do not specify compare to what i do not see in the table a jump in performance of more than 81 compare to the proposed baselines the authors chose to have metric learning methods as a baseline and compare their classifier with tripletcontrastive networks while deep metric learning advanced recent methods proposed frameworks that do not necessary need to form pairs see proxy anchor loss for deep metric learning from s kim et al or no fuss distance metric learning using proxies by y movshovitzattias et al the method proposed by the authors seems to be an easier setting than the method proposed in the stated paper since they are classifying in a zeroshot manner the new classes are not seen in training the main issue that could bring concern is the design of the weight for the new class in eq 4 by not finetuning the network on the new accessible classes there is a chance that the features used to compute the new weights belong to a similar distribution than features from another existing classes it could then happen that a new weights is equal to an already existing weight how to make sure that 
it does not happen another issue is the lack of comparison with existing work it would have been interesting to compare with stateofthe art incremental learning method or the newest metric learning method it would have provided a strong support for claiming that the authors work is new and performant there are too few related work reviewed in sec 5 my recommendation for this paper is a strong reject i am not convinced of the novelty proposed in this paper and there are too few comparison with existing work to strengthen the validity of the performance on the proposed datasets docsepthe paper proposes an approach to fewshot learning based on the the cosface normalized softmax loss after pretraining class weights are added to the cross entropy loss for each new class in the test set which are computed by averaging over inferred weights from a support set while fixing the remaining network weights feature extractor experiments conducted on cifar10 and fashionmnist indicated gains over baseline approaches strengths the presentation and writing is clear and easy to follow the proposed approach is simple and efficient weaknesses the paper states that deep metric learning approaches dml suffer from high computitational complexity due to reliance on pairwise learning constraints however many approaches in dml effectively circumvent the computational burden using sampling strategies eg a or class proxies eg b etc in particular the latter has shown to be both very computationally efficient and competitive with the stateoftheart in dml in general lots of references to established works in deep metric learning dml and fewshot learning fsl are missing for extensive discussions of both fields eg see c d the core idea of the presented approach is adding additional neurons respectively class weights for new classes to the entropyloss based classification the class weights are set to the average over inferred weights based on few representatives of the new classes and subsequently fixed for classifiying unseen samples however it is questionable if the proposed approach would work on larger more complex and finegrained datasets as inferring class weights by averaging over few samples may be too unstable for representing finegrained class features more experiments on larger standard benchmark sets would be very helpful the experimental section lacks details about the trained model ie architecture details and optimization paramters no standard architectures commonly used in dml or fsl cd seem to be considered for fair comparison to the standardstateoftheart approachess moreover comparison to stateoftheart approaches in dmlfsl are lacking in the quantative evaluation only tripet and contrastive learning seems to be considered however reference about the exact implementation details of these approaches are missing as well finally the evaluation also does neither consider standard benchmark sets in dml eg cub200 cars196 standford online products or fsl miniimagenet cifar100 nor follows the established evaluation protocols of these areas in total the evaluation protocol significantly lacks comparison to established results in dml fsl and thus does not allow for proper evaluation of the effectiveness of the proposed approach evaluation on the plantnet dataset lacks comparison to other approachesbaselines references a wu et al 2017 sampling matters in deep embedding learning b kim et al 2020 proxy anchor loss for deep metric learning c roth et al 2020 revisiting training strategies and generalization performance in 
deep metric learning d tian et al 2020 rethinking fewshot image classification a good embedding is all you need the paper lacks novelty and significance both for the presented approach and the empirical results further the quantitative evaluation is based on nonstandard datasets evaluation protocols architectures and baselines hence does not provide a proper basis to evaluate the effectiveness of the proposed approach ### Summary:
this paper tackles an openset setting where new classes with few labeled examples are introduced after the initial pretraining on different categories a simple approach is proposed based on a normalized softmax classifier and feature averaging to generate a classifier for the new categories results are shown on a few standard datasets as well as the plntnet dataset while reviewers found the topic and setting as well as plntnet dataset interesting they had significant concerns on the novelty t3uk tp5p ahvj contribution and rigor of the empirical evaluation since the method is simple and largely leverages from prior works the latter is especially important reviewers pointed out that some of the latest in metric learning is ignored eg proxy anchor tp5p and ahvj and no comparison is made to other classes of methods that by the authors admission are very close to the setting such as openset recognition especially those that seek to classify new categories and incremental learning unfortunately no rebuttal was provided by the authors so these significant concerns remain and the paper cannot be accepted asis since the reviewers did appreciate the setting and dataset i recommend refining the paper and significantly beefing up the empirical evaluation for future resubmissions
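as an aside for readers unfamiliar with the feature-averaging idea discussed above, the following is a minimal numpy sketch of one way it could look, not the authors' implementation: weights for unseen categories are set to the normalized mean of (normalized) support features from a frozen extractor, and classification is scaled cosine similarity in the style of a normalized softmax; all array shapes, the scale value, and the random data are illustrative assumptions.

```python
import numpy as np

def l2_normalize(x, axis=-1, eps=1e-12):
    # project features / class weights onto the unit sphere
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def imputed_class_weight(features):
    # features: (n_support, d) embeddings of one unseen class from a frozen extractor;
    # the new class weight is the normalized mean of the normalized features
    return l2_normalize(l2_normalize(features).mean(axis=0))

def normalized_softmax_scores(features, class_weights, scale=16.0):
    # scaled cosine similarity between unit-norm features and unit-norm class weights
    return scale * l2_normalize(features) @ l2_normalize(class_weights).T

# toy usage: 3 seen classes already have trained weights; add 2 unseen classes
rng = np.random.default_rng(0)
d = 64
W_seen = l2_normalize(rng.normal(size=(3, d)))
support_sets = [rng.normal(size=(5, d)), rng.normal(size=(8, d))]  # few examples per unseen class
W_new = np.stack([imputed_class_weight(f) for f in support_sets])
W_all = np.concatenate([W_seen, W_new], axis=0)   # joint seen+unseen classifier
queries = rng.normal(size=(4, d))                 # embeddings of test images
predictions = normalized_softmax_scores(queries, W_all).argmax(axis=1)
print(predictions)
```

in this view adding a category is just appending one averaged vector to the weight matrix, which is also why the reviews ask what happens when an unseen class is very similar to a seen one: the imputed weight can land close to an existing weight and the cosine scores become ambiguous.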
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper considers the contextual batched bandit setting and introduces the idea of imputation utilizing the nonexecuted actions in each batch this provides better regret properties than without and also further speedup is provided by considering the sketch version theoretical results in terms of sketching performance are also provided such as going down from obd2 to ocd2 for sketch size c as well as the regret bounds of the sketched approach spuir experimental results are shown on a couple of datasets to showcase the improved performance over stateoftheart batched bandit algorithms such as bexp3 bltsb pros the paper is wellwritten and considers an important problem of batched linear bandits in the literature a the authors introduce the idea of imputation to further speedup the learning process regret bounds showing that the lower variance of the imputation approach over the vanilla version further strengthens the paper b sketching is then introduced to further speed up the computation of the regression problem the introduced sketching error is further tuned by appropriately setting the sketch size to provide a sublinear regret of osqrtmdt c experimental results are pretty convincing on the provided datasets of both improved regret performance as well as in terms of sketching time better regret performance is achieved by spuir compared to other approaches but roughly less than half the time of puir questions 1 why are the thetas for each of the actions in the episodes computed afresh could we reuse the previously computed thetas to regularize the solution in the current batch and get rid of the imputation which is expensive 2 this uses the classic sketch and solve paradigm can we utilize the iterate and sketch to further speed up the approacha a oblivious sketchingbased central path method for linear programming icml 2021 overall the paper solves an important problem in the literature of contextual batched bandits the proposed approach of imputation of unobserved actions is sound and both theoretical results in terms of regrets bounds and superior practical performance on realworld datasets are shown in the paper still unsure about the imputation approach taken and if we can further improve the sketching approach considered in the paper docsepthe paper considers the behavior of a series of algorithms for learning in a batched bandit setting where parameters are armaction specific rather than shared across armsactions these algorithms are based on ucb like techniques with the addition of a regularization term that aims to use information from imputed unobserved rewards such techniques are present in the linear regression and dimensionality reduction literature where one aims to use either prior information or data distribution information to aid prediction however to my knowledge these formalisms are quite new to online settings like bandits thus making the presented approach appealing positive points the paper is well written and generally easy to follow the results are clearly communicated both means and standard deviations are provided eg in tables in my biased opinion the problem proposed and the chosen approach are both interesting major issues details regarding the imputation procedure itself were scarce and hard to parse the effect of different imputation methods could be rather large this goes unaddressed in the current paper comparisons are performed with methods where 
the parameters are shared across arms/actions while this assumption is
restrictive application-wise it is important to understand the benefits if any of imputation in those cases as well there is little intuition regarding the support/rank of the context matrix, the arm-specific parameters, and their interaction and how this can impact
whether imputing contexts is helpful or not simple thompson sampling seems to be performing comparatively well minor issues the number of iterations used to compute the means and standard deviations should be made explicit 10 100 for the top performing algorithms it would be interesting to see the standard deviation bands throughout the plots claiming that most literature ignores potential rewards is slightly misleading bayesian methods aim to model distributions over 
rewards given contexts hence there is an implicit occasional spelling suggestions for improvement assess the impact of imputation when different imputation methods are considered be explicit about how the imputation is achieved assess the performance of the algorithms in settings where the armaction parameters are the same across armactions consider synthetic experiments where the distributions of contexts and action parameters are made explicit and where the rewards
are modeled as explicit functionals of context and actions i believe 1 could be a great basis for such experiments address minor issues
 1 lloyd james et al random function priors for exchangeable arrays with applications to graphs and relational data advances in neural information processing systems 25 2012 9981006 recommendation weak reject the paper addresses an interesting question and is generally well written with a promising algorithmic direction however i believe the central imputation idea gets lost in the many variants considered i believe having sketching variants is important and i liked that facet of the paper however i believe that there isnt sufficient clarity on how the actual imputation is accomplished and what is the impact of such imputation further the paper does not provide intuition on what modeling assumptions and statistical data features are behind the performance improvements when imputation is used docsepthis paper addresses batched contextual bandits with a fixed set of actions and separate unknown parameters for each action the authors propose an approach where the unobserved rewards ie the rewards that would have been obtained if actions that hadnt been selected for a given context are imputed and these imputed values are incorporated in to the regularization of the parameter estimates in a linucblike algorithm for reasons of computational efficiency the authors also design a process to approximate this regularized estimator via a sketching technique for the approaches with and without sketching the authors derive a osqrtdmt regret bound where m is the number of actions and d is the dimensionality of the parameter vectors which broadly matches what is expected in the nonbatched setting they argue that the proposed algorithms have uniformly lower variance in the instantaneous regret than approaches without imputation and any increase in bias decreases exponentially quickly versions with timeadaptive parameters and for nonlinear functions are proposed without a full theoretical treatment and all the proposed algorithms are shown to perform well in an empirical study pros the theoretical work is complex and besides the comment i make in the major comments section accurate as far as i can tell the paper has compared its approaches to a number of sensible benchmarks on a nice mix of simulated and real dataset and for several realistic extensions of the model the problem studied is of genuine interest and the approach taken to derive a new algorithm is an interesting combination of ideas from the literature major comments on cons my main issues are that the explanations of the main algorithm are not sufficiently detailed to convince me that it is as effective as suggested there is also an error in the theory which once corrected weakens the contributions see more details below construction of the imputation regularized ridge regression o the paper is mostly accessible with some omissions of detail as described in the more minor comments below until this point where it stops being as userfriendly some more textual explanation of the terminology is needed here to help guide the reader as to the function and interpretation of matrices l t phi psi and b this kind of discussion is provided prior to equation 1 o the jump from equation 1 to equation 2 is not obvious and could do with some further explanation o related to this its not obvious where eta comes from is it a function of gamma and lambda if not why does it not enter explicitly in to eq 1 somewhere are there infinitely many solutions and a particular choice of eta parameterises a specific one o at a more conceptual level its hard to understand 
what this complex process of imputation achieves that is more than just reducing the importance of the regularization term and increasing the importance of the datadriven term since the imputed values for a batch are deterministic functions of the observed data in that batch is the entire imputation process not equivalent to a very specific means of reducing lambda comments on the highlevel idea of reward imputation and the explorationexploitation tradeoff o something that came to my mind upon reading the abstract but i couldnt find directly addressed in the paper is the idea that reward imputation could feasibly push an algorithm towards exploitation and away from exploitation im imagining a scenario where we draw a small number of samples for a particular action in an early round which are not particularly representative of its true value and lead to a biased estimate through deterministic imputation of the rewards that would have been observed on the other contexts this bias will propagate further and would seem to only encourage the algorithm to favour this action less if this happens to be a good action there would be a concern that this could lead to sticking in suboptimal actions and incurring more regret than algorithm which does not impute on the basis of biased estimates o a particular concern in this setting would be that it leads to heavy tails on the regret distribution as the sticking will only happen sometimes and by reporting only means and standard deviations this phenomenon is somewhat obscured this issue has been resolved in the new version error in the theory at eq 19 o i believe unfortunately that there is an error at eq 19 in the proof of theorem 2 it has nested within a square root for eta in 01 the equality sumi0n etani etan1etan11eta which is not correct it should be with some simplification added for clarity sumi0n etani 1sumi1n etan 11etan1eta this unfortunately cannot then be bounded as a decreasing function of n and i believe leads to a constant order additional bias perhaps bounded as ctheta gamma 111eta in the final statement of the theorem if i am not mistaken and there is not some other technique that can be used to remove this additional bias from a conceptual viewpoint this seems unlikely as it makes sense for a reduction in variance to be necessarily traded off against an increase in bias this necessitates some modifications to the discussion and reframing of the contributions the exponentially decreasing bias is sold as a major contribution indeed arguably oversold as it is only the additional bias which would exponentially vanish not the entire bias and therefore the loss of this does affect the message of the paper somewhat minor comments on cons the related work section may be improved and made more useful by indicating how these papers link to your work principles of making a batched decision optimism vs other methods similarities or inspiration in the theoretical analyses the sketching approach is quite an important concept to the paper but one that is not commonly used in bandits and therefore potentially unfamiliar to prospective readers of the paper it would be nice to make it clearer what it involves conceptually when its first mentioned in the introduction i dont think m is defined further to this it may be useful to give a sense of the likely values of m b d early in the paper for instance i found myself worrying about situations where b is close to m and some actions are not used at all in some early rounds which seems to be less relevant 
when you get later in the paper and realise the intended setting is mb the definition of a policy pi could be more mathematically precise the text seems to suggest that pin is a single sampling distribution on mathcala used for all elements of the nth batch but it is not clear how precisely it maps from the contexts and history to actions and whether it can vary across the batch depending on say the earlier selections in the batch the distribution of the rewards is not made clear in section 2 only the functional form of their expectation is this deliberate the experiment in figure 1 isnt fully reproducible as the experimental parameters reward distributions context distribution and batch size arent specified this also makes it harder to interpret what level of feedback the algorithm could expect without any additional information finally its not clear what modifications are made to the han et al algorithm to adapt to the actionspecific reward parameters or whether this is all conducted in the setting of a shared reward parameter on page 4 you say for forall aj which reads as saying for twice from section 4 the amount of whitespace seems to have been reduced between paragraphs this makes things quite dense and hard to read at the start of the proof of theorem 2 triangle inequality i dont think you need the s neq 0 condition in thm 2 since gamma is positive semidefinite not positivedefinite there is a substantial amount of work which has gone into this paper and the authors have combined several different ideas contextual bandits batched bandits sketching approximations for efficiency together in a mostly careful manner this is nontrivial work my concerns are that there several areas where the explanations feel limited and not user friendly where the interesting features of the method are not fully explained and where there are slipups in the theory and reproducibility the latter two of these points mean that im not fully convinced of the effectiveness and utility of the method i also question whether iclr is the right venue for this work it seems that it has been difficult to adequately explain the method its theoretical guarantees and its place within the literature within 9 pages and a more accessible discussion may be achievable in a journal paper ### Summary:
in this paper the authors consider a contextual batched bandit setting where they rely on imputation in order to estimate the nonexecuted actions in each batch even though the idea is quite interesting and can lead to new methods there are still a lot of issues raised by the reviewers in particular part of the proof was incorrect and the authors tried to fix it but given the short time the reviewers felt that this part should be rewritten and scrutinized further also there are many suggestions by reviewers that the authors need to apply in order to make this work publishable
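for reference on the eq 19 point raised in the third review above the disputed quantity is a finite geometric series the identity below is a general fact about such series stated here in place of the paper's own eq 19 which is not reproduced and any link to the paper's exact notation is an assumption

\[
\sum_{i=0}^{n} \eta^{\,n-i} \;=\; \sum_{j=0}^{n} \eta^{\,j} \;=\; \frac{1-\eta^{\,n+1}}{1-\eta} \;\le\; \frac{1}{1-\eta}, \qquad \eta \in (0,1)
\]

this sum increases toward 1/(1-eta) as n grows rather than decaying in n which is consistent with the reviewer's argument that the corrected bound leaves a constant order additional bias term instead of an exponentially vanishing one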
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: 1 this paper proposes a treestructured multitask model recommender it takes as input an arbitrary cnn model and some predefined tasks and outputs the topk treestructured multitask architectures that achieve good performance within some computation budget limitation 2 the whole process is in a whitebox manner 3 extensive experiments validate the effectiveness of the proposed approach personally speaking during the process how to estimate the task accuracy without actual training is pretty interesting and might be useful for automl community the proposed approach theory and experiments are sound among the three key components branching point detector design space enumerator and task accuracy estimator it is necessary to explain how task accuracy estimator predicts task performance of architecture wo performing actual training works for example what is the motivation for svde instead of the regular entropy why could the associated twotask models be used to evaluate the performance of the actual multitask structure how is the computation budget involved this part is not clear in table 2 the column combudget indicates the number of models different models might have varying numbers of parameters and the number of models could be used to represent budget the paper is well written and very easy to follow though there are some reference citing errors such as line 145 line 270 etc appendix section please see the above section technical quality and correctness this paper proposes a treestructured multitask model recommender it takes as input an arbitrary cnn model and some predefined tasks and outputs the topk treestructured multitask architectures that achieve good performance within some computation budget limitation in a whitebox manner the proposed approach consists of three key components branching point detector automatically detects branching points computation blocks such as residual block in resnet50 design space enumerator lists all the multitask architectures task accuracy estimator predicts task performance of architecture wo performing actual training please check the above section technical quality and correctness for negative points docsepthe authors work with multitask learning which aims to solve multiple tasks simultaneously the authors propose an approach that takes as inputs a backbone model and a set of tasks of interest and then predicts the topk treestructured multitask architectures that achieve high task accuracy while meeting a userspecified computation budget the authors propose to achieve it using three major components a task accuracy estimator design space enumerator and branching point predictor unlike previous approaches the authors claim that their approach does branching automatically without the need for domain expertise and efficient computation and can work under a given constraint the authors furthermore maintain their assertions from the experiments on various datasets the work performed by the authors does have a significant impact and is of interest to the automl community in general i have come to this conclusion for three reasons the approach is endtoend and can be deployed on any arbitrary pretrained model the approach fares well against current sota and does not require any retraining it gives the topk candidate set of architectures that perform well for the given set of tasks and aid better analysis studies in the future the paper is
technically sound across most sections but i do have one question i hope the authors could answer for me section 31 the authors propose the task accuracy estimator by averaging over all associated twotask architectures i would suggest running with multiple seeds and reporting the variance as well to confirm that the model consistently gives outputs that have a high correlation with the oracle ranking and that the results are not an artifact of the seed why do you consider only two task architectures and not a subset of all task structures do we not need to worry about the task interactions and interventions here the paper is clearly written and is easy to follow readers can easily capture the main ideas as we mentioned in item 1 strengths the paper proposes a novel approach for mtl using a treestructured architecture candidate generation the authors show the robustness of this approach across multiple datasets and settings weakness the authors propose a twotask accuracy estimator however it would be nice to share how the task interactions and interference would be if we did not consider this scenario why not more than 2 is it for complexity reasons or does this setting achieve best performance the models have only been run on one seed and it would be very useful to see it run on multiple seeds reporting both mean and var of performance to show that their approach is not an artifact of luck and is robust overall i find the paper exciting and fun to read the authors introduce concepts that help in the field of mtl and are very useful to the research community in general their endtoend approach is efficient and does not require additional training unlike many similar approaches furthermore it returns a candidate set of architectures and makes the approach very useful for analyzing the neural network better docsepna for reproducibility reviewers na for reproducibility reviewers na for reproducibility reviewers na for reproducibility reviewers na for reproducibility reviewers na for reproducibility reviewers ### Summary:
this paper proposes a recommender that given a convolutional model and a set of tasks proposes topk treestructured multitask architectures that can achieve high accuracy by figuring out where to branch so that the appropriate parts of the network are shared across tasks while adhering to a computational budget the reviewers found the proposed approach experiments and theory sound and novel the presentation clear the empirical performance strong and the contribution overall important for the research community this is why i recommend acceptance the reviewers though do point out some flaws that should be addressed as well as possible in the revised version these relate to motivating components of the approach clarity with regards to measuring computational budget and the need to run experiments with additional random seeds an important thing to flag is that while the authors claim they will publish their code on github as soon as possible as it stands currently their code is not available
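as an illustration of the two task averaging idea that the reviews describe for the task accuracy estimator the following is a minimal sketch in python all names and numbers are hypothetical and this is not the authors' implementation which as noted above is not publicly available

# minimal sketch (hypothetical names): estimate each task's accuracy for a
# candidate multi-task architecture by averaging the measured accuracies of
# the associated two-task models that branch at the same point
from statistics import mean

def estimate_task_accuracies(tasks, branch_point, two_task_acc):
    # two_task_acc[(task, partner, branch_point)] is the accuracy of `task`
    # when trained jointly with `partner`, branching at `branch_point`
    # (assumed to be precomputed offline)
    estimates = {}
    for task in tasks:
        partners = [p for p in tasks if p != task]
        estimates[task] = mean(two_task_acc[(task, p, branch_point)] for p in partners)
    return estimates

# example usage with made-up accuracies for three tasks branching at block 3
acc = {("seg", "depth", 3): 0.71, ("seg", "normals", 3): 0.69,
       ("depth", "seg", 3): 0.64, ("depth", "normals", 3): 0.66,
       ("normals", "seg", 3): 0.58, ("normals", "depth", 3): 0.60}
print(estimate_task_accuracies(["seg", "depth", "normals"], 3, acc))

the point of the sketch is only that a candidate architecture's per task score can be read off from a small table of two task runs without retraining the full multitask model which is the property the reviewers found most interesting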
1805, 1783, 2175, 275, 253, 2852, 50276, 783, 2929, 310, 22335, 3590, 2439, 954, 7118, 533, 891, 513, 452, 581, 1953, 891, 3524, 253, 4477, 812, 3662, 323, 479, 50275, 4674, 4562, 253, 4477, 12661, 253, 4836, 7200, 29107, 407, 25001, 689, 512, 2330, 2500, 302, 1945, 35615, 891, 651, 1804, 3515, 342, 2709, 12922, 285, 9610, 253, 11041, 347, 973, 281, 6583, 326, 253, 1566, 12724, 4245, 18012, 326, 452, 247, 1029, 5921, 342, 253, 42295, 19947, 285, 326, 253, 1543, 403, 417, 271, 34332, 273, 253, 8357, 50276, 22309, 513, 368, 1908, 760, 767, 4836, 35615, 285, 417, 247, 8578, 273, 512, 4836, 5289, 513, 359, 417, 878, 281, 7664, 670, 253, 4836, 6355, 285, 12214, 1060, 50276, 783, 2929, 310, 4518, 3542, 285, 310, 3477, 281, 956, 10668, 476, 4354, 9232, 253, 2022, 5697, 347, 359, 5393, 275, 5382, 337, 50274, 296, 3755, 20556, 50275, 783, 2929, 29328, 247, 4460, 2746, 323, 278, 17945, 970, 247, 2578, 383, 957, 1520, 10336, 7431, 5978, 50276, 783, 4477, 921, 253, 31640, 273, 436, 2746, 2439, 2709, 15302, 285, 7533, 50275, 20881, 1255, 50275, 783, 4477, 12661, 247, 2500, 302, 1945, 7200, 29107, 2299, 352, 651, 320, 5322, 281, 3894, 849, 253, 4836, 6355, 285, 11689, 651, 320, 604, 359, 858, 417, 1908, 436, 10076, 2139, 417, 625, 685, 374, 310, 352, 323, 10454, 4606, 390, 1057, 436, 4758, 5115, 1682, 3045, 50276, 783, 3210, 452, 760, 644, 1408, 327, 581, 8357, 285, 352, 651, 320, 1077, 4217, 281, 923, 352, 1408, 327, 2709, 12922, 9610, 1097, 1599, 285, 945, 273, 3045, 281, 921, 326, 616, 2746, 310, 417, 271, 34332, 273, 7516, 285, 310, 10237, 4583, 891, 1089, 253, 2929, 12302, 285, 794, 281, 1239, 253, 4477, 9569, 12342, 326, 1361, 275, 253, 1673, 273, 278, 17945, 285, 403, 1077, 4217, 281, 253, 2561, 3114, 275, 2087, 616, 990, 936, 423, 2746, 310, 5919, 285, 1057, 417, 2430, 3081, 3733, 12401, 1142, 2074, 7274, 33810, 352, 6548, 247, 7431, 873, 273, 35615, 285, 2789, 253, 2746, 1077, 4217, 323, 18918, 253, 11454, 2990, 1805, 5474, 33032, 2072, 323, 38041, 30628, 5549, 323, 38041, 30628, 5549, 323, 38041, 30628, 5549, 323, 38041, 30628, 5549, 323, 38041, 30628, 5549, 323, 38041, 30628, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 3818, 3109, 326, 1677, 247, 27311, 267, 1566, 285, 247, 873, 273, 8892, 29328, 1755, 76, 2578, 383, 957, 1520, 1554, 262, 1945, 35615, 326, 476, 5115, 1029, 7200, 407, 36182, 562, 835, 281, 7789, 594, 326, 253, 4569, 4243, 273, 253, 2990, 403, 6096, 2439, 8892, 1223, 519, 42527, 281, 247, 15180, 7563, 253, 30628, 1119, 253, 4081, 2746, 4679, 285, 3762, 3590, 285, 4460, 253, 9759, 2590, 253, 16774, 3045, 2266, 285, 253, 7680, 4583, 1774, 323, 253, 2561, 3114, 436, 310, 2139, 891, 5583, 14924, 50275, 783, 30628, 2167, 513, 1127, 562, 690, 32138, 326, 943, 320, 9713, 347, 973, 347, 1896, 275, 253, 17265, 2715, 841, 14588, 281, 15265, 839, 4295, 273, 253, 2746, 19843, 342, 17730, 281, 10499, 15180, 7563, 285, 253, 878, 281, 1408, 4679, 342, 3081, 3632, 12922, 50276, 266, 1774, 2181, 281, 7908, 310, 326, 1223, 253, 4477, 1750, 597, 588, 15452, 616, 2127, 327, 40477, 347, 3517, 347, 1896, 347, 352, 9572, 4390, 616, 2127, 310, 417, 2130, 209 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: summary this paper seeks to solve the program translation problem through code retrieval it proposes an interactive code retrieval system called iptr to perform crosslanguage retrieval with minimum code pairs the method extracts textual features and structural features from code and transforms the features into targetlanguage features through an autoencoder which is trained in a unilanguage manner it further leverages user corrections to update the feature representation and for a new round of retrieval

the problem is challenging especially since the parallel data is scarce i think the idea of utilizing both textual features and structural features is interesting the idea of training an autoencoder for each language and fusing the encoder and decoder of different languages is also very interesting though questionable as mentioned below

concerns

the major concern is that this paper does not make a significant novel technical algorithm/theory contribution but is rather like proposing a system therefore in my opinion the paper does not fit iclr very well

due to the lack of a parallel training corpus for the query transformation model qtm the paper utilizes simple unilanguage autoencoder training for each language and takes the encoder and decoder from the corresponding language respectively as the transformer model this could cause significant issues if not taken seriously the hidden spaces of different languages can vary drastically and hence the encoder output is very likely to not make any sense to the decoder to perform the transformation there should be a mechanism that encourages the hidden spaces of different languages to be similar

the paper tries to solve the program translation problem with a retrieval method in the experiment what is the database to retrieve from is it that all training code forms the database or is it all training and testing code forms the database in practical situations the desired translation is very likely to not exactly match some code in the database therefore i dont understand why the program accuracy of the retrieval method can be so high if the database doesnt overlap with ground truth codes a lot

---

this work proposes a retrievalbased approach for program translation existing mldl models for program translation typically design a decoder to directly generate the code in the target language on the contrary this work designs iptr which first computes a feature representation of the target code then retrieves the most similar code in a large code corpus specifically iptr includes a query transformation model qtm which generates the feature representation of the target code given the feature representation of the source code as the input the feature representation includes the information of tokens in the code and the paths in the syntax trees the idea of using the paths is similar to the code2vec paper qtm could be trained without parallel data between source and target codes specifically encoders and decoders of different programming languages could be trained in a similar way to the training of autoencoders this idea is similar to transcoder meanwhile they show that iptr could be trained with active learning and user feedback where they use active learning to acquire limited parallel training data and user feedbacks are corrections of the wrong output code they compare their approach to existing mldl program translation models as well as code retrieval systems they show
that iptr performs better than other baselines even without active learning and feedbacks not surprisingly active learning and incorporating user feedbacks further improve the performance

program translation is an important application and the authors did a good job of evaluating on existing benchmarks and comparing with different types of baseline models intuitively it makes sense that retrieval could at least provide more syntactically and semantically correct code while synthesis models may struggle with generating coherent code however i have a couple of questions about the assumption of the task setup and the implementation of the algorithm and i list them below

1 it seems that the feature representation is summarized per complete code snippet for translation therefore do the authors assume that the target code always exists in the retrieval corpus as a single piece of code if this is the case one clear limitation is that the retrieval approach could only search for existing code in the corpus while the synthesis model could combine lines from different code snippets to construct the output code

2 how do you simulate users for active learning and user feedback for active learning do you ask people to annotate the ground truth output programs for input programs and then add them into the corpus for retrieval for user feedback in some parts of the paper you mention that the user modifies the extracted features of the code but sometimes you also say that the user corrects the first wrong line could you provide some concrete examples of how you incorporate human annotations in training and inference loops

3 more concrete examples of how to sample code for active learning could help now the description in section 3.2.2 is not clear for example why ns is 5 instead of 4 what is the benefit of selecting program data where the largest disagreement occurs among those sampling strategies do you compare among the entire training corpus

4 what are the numbers of different paths and tokens for the feature representation of code are text tokens representing terminal nodes or nonterminal nodes for example are variable names included in tokens or are they simply denoted as identifier my understanding is that the feature vectors follow the bagofwords representation thus i dont quite understand why this simplified representation could work better than code2vec or other more advanced program encoders the authors say that the word2vec and code2vec variants of ptr cannot outperform ptr with our proposed feature representation based on structural features but i think code2vec encodes very similar structural features in a potentially more concise way

writing comments the paper requires proofreading and there are a couple of typos for example 1 to approximate measure the informativeness of the given program on page 4 approximate should be approximately 2 to reflect this in out feature representaion on page 4 out should be our

---

this paper proposes a program translation retrieval method by combining a syntax tree transformer with a specific encoderdecoder and interactive signals from the user the novelty of each used technique is limited but they are reasonable for the code translation task with high accuracy even for the unsupervised version without interactive signals

the experimental results are solid and demonstrate the power of representation of the proposed variant of syntax tree the idea of aead with specific language is quite simple but it seems that the qtm successfully transfers the representation across different
languages the performance gain achieved by interactive signals is reasonable questions there are other structural features of programming language such as graph which can provide rich information of the code snippets rather than syntax tree is the proposed method a better code representation could it be applied to other downstream tasks after pretraining will the authors release their code to public ### Summary:
i found the setup for this paper a bit contrived the tool is presented as a code translation tool but it really functions more as a multilanguage code search tool the idea is that one has a program in language a and a database that contains the same program in language b so one can translate from a to b simply by searching for the right program in the database when evaluated as a language translation tool it appears to outperform existing language translation schemes but this is an unfair comparison because iptr is being given a database that contains the exact translation of the program in question the performance is also compared with code search tools but these are also applestooranges comparisons because the tools in question are operating from very highlevel queries a much more comparable baseline would be the yogo tool recently published in pldi httpsdlacmorgdoiabs10114533854123386001 or for compiled languages you could compare against statistical similarity tools for binaries httpsdlacmorgdoi10114529809832908126 the experiment in the appendix a5 is more fair to standard language translation and it yields results that are much less impressive i would be much more comfortable with this paper if it were written around this experiment or alternatively if it were evaluated against a more comparable approach for semantic code search
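as a concrete illustration of the per-language autoencoder composition that the reviews above question, the following minimal sketch shows how a query transformation of that kind could be wired up; the feature dimension, model sizes and language pair are hypothetical placeholders for discussion, not the paper's actual implementation:

```python
import torch
import torch.nn as nn

class CodeAutoencoder(nn.Module):
    # one autoencoder per programming language, trained only on that language's
    # bag-of-features vectors (tokens + syntax-tree paths), so no parallel data is needed
    def __init__(self, feat_dim: int, hidden_dim: int = 256):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(feat_dim, hidden_dim), nn.ReLU())
        self.decoder = nn.Sequential(nn.Linear(hidden_dim, feat_dim), nn.Sigmoid())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

# hypothetical setup: feature dimension and languages are placeholders
ae_src = CodeAutoencoder(feat_dim=5000)   # e.g. trained on java snippets only
ae_tgt = CodeAutoencoder(feat_dim=5000)   # e.g. trained on python snippets only

def transform_query(src_features: torch.Tensor) -> torch.Tensor:
    # cross-language query transformation: source-language encoder composed with the
    # target-language decoder; nothing in the two reconstruction losses forces the
    # latent spaces to line up, which is exactly the first reviewer's concern
    z = ae_src.encoder(src_features)
    return ae_tgt.decoder(z)
```

the retrieval step would then rank target-language snippets in the corpus by similarity to the transformed feature vector, which is also why the reviewers ask what happens when the database contains no close match to the desired translation.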
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes a method to segment and decouple dynamic objects while recovering the static environment from a monocular video it adapts from nerf and its extension hypernerf but with improved handling of shadow regions as well as a loss to promote correct separation of dynamic and static regions it demonstrates plausible motion segmentation and shadow removal results compared to recent nerf based methods

this paper is a solid development for using nerf to reconstruct from monocular videos with dynamic foregrounds the paper in general is wellwritten and the claims are well supported the adaptation it made to handle shadow and foreground background separation is elegant and seems to be effective it definitely holds value to people working on similar problems and deserves publication

there are a couple of things i hope the authors could comment on

1 in ln 226 "to demonstrate the ability of fully selfsupervised scene decoupling we do not apply any masks when registering realworld images using colmap" in my experience without feeding masks colmap tends to make wrong estimation of camera poses when the foreground is sufficiently large which will definitely result in wrong reconstruction of the static background it seems the proposed method does not attempt to update the camera pose during optimization so the claim above looks confusing to me

2 it would be nice if the author also visualizes the depth map of the reconstructed foreground and background

3 the results of hypernerf look far worse compared to the proposed method even in the dynamic region given that the proposed method seems to have little difference from hypernerf at least in the dynamic regions the current result surprises me a bit could the author make additional comments on the possible reasons

the method cannot handle high frequency viewdependent radiance change due to the monocular moving camera setting

---

this paper presents d2nerf a selfsupervised method that takes a monocular video and learns a 3d scene representation that decouples moving objects including their shadows from the static background in addition this paper proposes a novel loss to promote the correct separation of phenomena for static and dynamic fields the authors further propose a shadow field network to detect and decouple dynamically moving shadows a new dataset was proposed containing various dynamic objects and shadows extensive experiments demonstrate that the proposed method can achieve better performance than stateoftheart approaches in decoupling dynamic and static 3d objects occlusion and shadow removal and image segmentation for moving objects

strength
- new dataset for static and dynamic field decomposition
- selfsupervised method that takes a monocular video and learns a 3d scene representation that decouples moving objects including their shadows from the static background
- novel loss to promote the correct separation of phenomena for static and dynamic fields
- a shadow field network to detect and decouple dynamically moving shadows
- sota results in several tasks such as decoupling dynamic and static 3d objects occlusion and shadow removal and image segmentation for moving objects

weakness
- require the accurate camera
- suffer from highfrequency viewdependent radiance change which has been discussed
- do not compare with other motion decoupling methods such as stnerf [18] nsff [24] and dynnerf [11] simone [19] star [63] although their experiment setting may be
inconsistent with the proposed method however i think in the synthetic dataset all of them can be reproduced please add some of their results for a complete comparison
- when i refer to the supplementary video there are still some wrong static and dynamic field decompositions such as the keyboard and i think this is a tradeoff
- it seems to lack generalizability just train and test in the same dataset

the authors have discussed the limitation and all of them are a considerable challenge i have no idea how to solve them and i think its a tradeoff

---

this paper proposes a new selfsupervised approach for segmenting moving objects from the static background it tackles issues encountered during training with several techniques such as skewed entropy ray regularization static regularization and integrating a shadow ratio to separate shadows

strengths
1 originality the proposed techniques are insightful and are beneficial to the community such as skewed entropy and ray regularization
2 quality the presented qualitative results are of high quality
3 clarity the manuscript is wellwritten
4 significance the task of decoupling dynamic objects and static background is important and the proposed approach tackles the longstanding problem with highquality results

weakness
1 i think the main weakness comes from the scenelevel tuning for the hyperparameters k, λs, λr, λσs and λρ
2 i understand that to push the number the author may need gridsearch for each scene however the lack of analysis for the sensitivity of those hyperparameters makes it unclear how robust the proposed approach is
3 especially considering in sec b of supp we have 9 sets of parameters for 19 scenes even for the same scene banana we have two sets of hyperparameters for two tasks decoupling and novel view
4 besides the perscene tuning results i recommend the author to report a set of quantitative results with a single set of hyperparameters if possible

i appreciate the authors discussions about the limitations of the proposed approach in sec 5 which helps the understanding of the suitable scenarios

---

this paper introduces a series of techniques to decouple dynamic objects from monocular videos which present impressive static background recovery with shadow removal

how to remove the disturbance from dynamic objects/occluders is a longstanding problem in the 3d vision domain i appreciate the simple but valuable design ie the skewed entropy loss with decoupled nerf and an extra shadow field however many previous methods have adopted similar techniques for foregroundbackground content separation some related literature is not included in the paper besides it reads like a cvprstyle paper im not sure whether its suitable for nips

strengths
1 the skewedentropy loss is able to flexibly separate the static and dynamic parts
2 separating the timevarying shadows and the static appearance via a shadow field
3 the decoupling performance and nvs quality significantly outperform sota ### Summary:
this paper attacks an interesting problem with nerf decoupling moving objects including their shadows from the static background all four reviewers recommend accepting the paper and the weaknesses identified did not detract from substantive contributions therefore i am accepting this paper
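the skewed entropy regularizer that several of the reviews above single out can be sketched concretely; the form below (binary entropy of a skewed dynamic-to-total density ratio with skew parameter k) follows the reviewers' description, but the exact loss, clipping and hyperparameter choices in the paper may differ:

```python
import torch

def skewed_binary_entropy(sigma_dynamic: torch.Tensor,
                          sigma_static: torch.Tensor,
                          k: float = 2.0,
                          eps: float = 1e-6) -> torch.Tensor:
    # w is the fraction of density explained by the dynamic field at each sample point
    w = sigma_dynamic / (sigma_dynamic + sigma_static + eps)
    # skewing with k > 1 biases ambiguous points toward the static field before the
    # binary entropy penalizes whatever ambiguity remains
    ws = torch.clamp(w ** k, eps, 1.0 - eps)
    return -(ws * torch.log(ws) + (1.0 - ws) * torch.log(1.0 - ws)).mean()
```

the per-scene tuning worry raised in the weakness points then amounts to how sensitive the decomposition is to k and the associated loss weights across scenes.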
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors introduce a novel onpolicy temporally consistent exploration strategy named neural adaptivedropout policy exploration nadpex for deep reinforcement learning agents the main idea is to sample from a distribution of plausible subnetworks modeling the temporally consistent exploration for this the authors use the ideas of the standard dropout for deep networks using the proposed dropout transformation that is differentiable the authors show that the kl regularizers on policyspace play an important role in stabilizing its learning the experimental validation is performed on continuous control learning tasks showing the benefits of the proposed this paper is very well written although very dense and not easy to follows as many methods are referenced and assume that the reviewer is highly familiar with the related works this poses a challenge in evaluating this paper nevertheless this paper clearly explores and offers a novel approach for more efficient onpolicy exploration which allows for more stable learning compared to traditional approaches even though the authors answer positively to each of their four questions in the experiments section it would like that the authors provide more intuition why these improvements occur and also outline the limitations of their approach docsepthe authors propose a new onpolicy exploration strategy by using a policy with a hierarchy of stochasticity the authors use a twolevel hierarchical distribution as a policy where the global variable is used for dropout this work is interesting since the authors use dropout for policy learning and exploration the authors show that parameter noise exploration is a particular case of the proposed policy the main concern is the gap between the problem formulation and the actual optimization problem in eq 12 i am very happy to give a higher rating if the authors address the following points detailed comments 1 the authors give the derivation for eq 10 however it is not obvious that how to move from line 3 to line 4 at eq 15 minor since the action is denoted by a it will be more clear if the authors use another symbol to denote the parameter of qz instead of alpha at eq 10 and 15 2 due to the use of the likelihood ratio trick the authors use the mean policy as an approximation at eq 12 does such approximation guarantee the policy improvement any justification 3 instead of using the mean policy approximation in eq 12 the authors should consider existing monte carlo techniques to reduce the variance of the gradient estimation for example 1 could be used to reduce the variance of gradient wrt phi note that the gradient is biased if the mean policy approximation is used 4 are theta and phi jointly and simultaneously optimized at eq 12 the authors should clarify this point 5 due to the mean policy approximation does the mean policy depend on phi the authors should clearly explain how to update phi when optimizing eq 12 6 if the authors jointly and simultaneously optimize theta and phi why a regularization term about qphiz is missing in eq 12 while a regularization term about pithetaz does appear in eq 12 7 the authors give the derivations about theta such as the gradient and the regularization term about theta see eq 1819 however the derivations about phi are missing for example how to compute the gradient wrt phi since the mean policy is used it is not apparent that how to compute the gradient wrt phi 
minor 12 is missing in the last line of eq 19 reference 1 michalis titsias and miguel lazarogredilla local expectation gradients for black box variational inference in advances in neural information processing systems pp 26382646 2015docsepthis paper proposed to use dropout to randomly choose only a subset of the neural network as a potential way to perform exploration the dropout happens at the beginning of each episode and thus leads to a temporally consistent exploration the paper shows that with a small amount of gaussian multiplicative dropout the algorithm can achieve stateoftheart results on benchmark environments and it can significantly outperform vanilla ppo for environments with sparse rewards the paper is clearly written and the introduced technique is interesting i wonder how different it is compared to parameter space exploration apart from the difference in memory consumption i feel that it is a straightforward extensiongeneralization of parameter space exploration but the stochastic alignment and policy space constraint seem novel and important the motivation of this paper is mostly about learning with sparse reward i am curious whether the paper has other good side effects for example will the dropout cause the policy to be more robust furthermore if i deploy the learning algorithm on a physical robot will the temporally consistent exploration cause less wear and tear to the actuators when the robot explores in addition i would like to see some discussion of whether this technique could be applied to offpolicy learning as well overall i like this paper it is well written the method seems technically sound and achieves good results for this reason i would recommend accepting this paper ### Summary:
the authors have proposed a new method for exploration that is related to parameter noise but instead uses gaussian dropout across entire episodes thus allowing for temporally consistent exploration the method is evaluated in sparsely rewarded continuous control domains such as halfcheetah and humanoid and compared against ppo and other variants the method is novel and does seem to work stably across the tested tasks and simple exploration methods are important for the rl field however the paper is poorly and confusingly written and really really needs to be thoroughly edited before the camera ready deadline there are many approaches which are referred to without any summary or description which makes it difficult to read the paper the three reviewers all had low confidence in their understanding of the paper which makes this a very borderline submission even though the reviewers gave relatively high scores
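The exploration scheme discussed in this example, drawing a Gaussian multiplicative dropout mask once at the start of an episode and keeping it fixed until the next reset, can be made concrete with a small sketch. This is a hedged, illustrative example only: the class name, layer sizes, and noise scale are assumptions for exposition, and it omits the KL regularization and training loop mentioned in the review; it is not the authors' NADPEx implementation.

```python
import torch
import torch.nn as nn

class EpisodeNoisePolicy(nn.Module):
    """Sketch of temporally consistent exploration: one Gaussian
    multiplicative mask is drawn per episode and reused at every step,
    so the whole episode is acted out by the same perturbed sub-network.
    Illustrative only, not the paper's exact architecture."""

    def __init__(self, obs_dim, act_dim, hidden=64, noise_std=0.1):
        super().__init__()
        self.fc1 = nn.Linear(obs_dim, hidden)
        self.fc2 = nn.Linear(hidden, act_dim)
        self.noise_std = noise_std
        self.mask = torch.ones(hidden)  # multiplicative mask on hidden units

    def reset_episode(self):
        # call together with env.reset(): sample a fresh multiplicative mask
        self.mask = 1.0 + self.noise_std * torch.randn_like(self.mask)

    def forward(self, obs):
        h = torch.tanh(self.fc1(obs))
        h = h * self.mask  # same mask for the whole episode
        return torch.tanh(self.fc2(h))
```

Calling reset_episode at each environment reset is what makes the noise temporally consistent rather than per-step, which is the property the reviewers contrast with ordinary action-space or per-step parameter noise.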
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper discusses a technique for continuous normalization flow in which the transformations are not required to be volume preserving the transformation with unity jacobian and architecture of neural network does not need to be designed to hold such property instead authors proposed no restriction on architecture of neural network to design their reversible mapping the paper has good background and literature review and as authors mentioned this paper is base on the idea of chen tian qi et al neural ordinary differential equations arxiv preprint arxiv180607366 2018 chapter two of this paper is summary of neural ordinary differential equations and chapter three is main contribution of this paper that can be summarized under two points 1 authors borrowed the continuous normalizing flow in chen et al and they have designed unbiased log density estimator using hutchinson trace estimator and evaluated the trace with complexity of od dimension of data instead of od2 that is used in chen et al paper 2 they proposed by reducing the hidden layer dimension of neural network it is possible that variance of estimator to be reduced novelty and quality the main contribution of this paper is summarized above the paper do not contain any significant theorem or mathematical claims it is more focused on design of linear algorithm that estimate continuous normalizing flow that they have borrowed from the chen et al paper this is a good achievement that can help continuous normalizing flow scale on data with higher dimensions but in results and experiments section no comparison has been made to performance of chen et al also no guarantees or bound has been given about the variance reduction of estimator and it is more based on the authors intuition clarity the paper is well written and previous relevant methods have been reviewed well there are a few issues that are listed below 1in section 3 the reason that dimensionality of estimator can reduce to d from d2 can be explained more clearly 2 figure 1 is located on first page of the paper but it has never been referred in main paper just it is mentioned once in appendix it can be moved to appendix 3 in section 311 the view view can be changed to view significance and experiments the experiments are very detailed and extensive and authors have compared their algorithm with many other competing algorithms and showed improvement in many of the cases as mentioned in quality and novelty part of the review just one comparison is missing and that is the comparison to method that the paper is inspired by it would be interesting to see how much trace estimator approach that has been used in this paper would sacrifice the negative loglikelihood or elbo specially in real data like mnist and cifar 10 it seems original paper has not reported the performance on those datasets as well is this difficult as chen et al paper algorithm for trace calculation has complexity of od2 docsepsummary this paper discusses an advance in the framework of normalizing flows for generative modeling named ffjord the authors consider normalizing flows in the form of ordinary differential equations as also discussed in 1 their contributions are twofold 1 they use an unbiased estimator of the likelihood of the model by approximating the trace of the jacobian with hutchinsons trace estimator 2 they have implemented the required ode solvers on gpus the models are evaluated on a 
density estimation task on tabular data and two image datasets mnist and cifar10 as well as on variational inference for autoencoders where the datasets mnist omniglot freyfaces and caltech silhouettes are considered the authors argue that the trace estimator in combination with reversemode automatic differentiation to compute vectorjacobian products leads to a computational cost of od instead of od2 for the exact trace of the jacobian they compare this to the cost of computing a jacobian determinant for finite flows which is od3 in general they argue that in general all works on finite flows have adjusted their architectures for the flows to avoid the od3 complexity and that ffjord has no such restriction however i would like the authors to comment on the following train of thought autoregressive models such as maf as well as iaf inverse of an autoregressive model do not require od3 to compute jacobian determinants as the jacobian is of triangular form note however they are still universal approximators if sufficient flows are applied as any distribution can be factorized in an autoregressive manner with this in mind i find the red cross for maf under freeform jacobian slightly misleading perhaps i misunderstood something so please clarify another topic that i would like the authors to comment on is efficiency and practical use one of the main points that the authors seem to emphasise is that contrary to autoregressive models which require d passes through the model to sample a datapoint of size d ffjord is a singlepass model requiring only one pass through the model they therefore indicate that they can do efficient sampling however for ffjord every forward pass requires a pass through an ode solver which as the authors also state can be very slow i could imagine that this is still faster than an autoregressive model but i doubt this is actually of comparable speed to a forward pass of a finite flow such as glow or realnvp on the other hand autoregressive models do not require d passes during training whereas if i understand correctly ffjord relies on two passes through ode solvers one for computing the loss and a second to compute the gradient of the loss with respect to model parameters so autoregressive models should train considerably faster the authors do comment on the fact that ffjord is slower than other models but they do not give a hint as to how much slower it is this would be of importance for practical use and for other people to consider using ffjord in future work for the density estimation task ffjord does not have the best performance compared other baselines except for mnist for which the overall best model was not evaluated mafddsf for variational inference ffjord is stated to outperform all other flows but the models are only evaluated on the negative evidence lower bound and not on the negative loglikehood nll i suspect the nll to be absent from the paper as it requires more computation and this takes a long time for ffjord without an evaluation on nll the improvement over other methods is questionable even if the improvement still holds for the nll the relative improvement might not weigh heavily enough against increased runtime ffjord does require less memory than its competitors the improved runtime by implementing the ode solvers on gpu versus the runtime on a cpu would be useful given that this is listed as one of the main contributions besides these questionscomments i do think the idea of using hutchinsons trace estimator is a valid contribution and the 
experimental validation of continuous normalizing flows is of interest to the research community therefore in my opinion the community will benefit from the information in this paper and it should be accepted however i do wish for the authors to address the above questions as it would give a clearer view of the practical use of the proposed model see below for comments and questions quality the paper has a good setup and is well structured the scope and limitations section is very much appreciated clarity the paper is clearly written overall the only section i can comment on is the related work section which is not the best part of the paper the division in normalizing flows and partitioned transformations is a bit odd partitioned transformations surely are also normalizing flows furthermore iaf by kingma et al is put in the box of autoregressive models whereas it is the inverse of an autoregressive model such that it does not have the dpass sample problem for a reader who is not too familiar with normalizing flows literature i think this section is a little confusing furthermore there is no related work discussed on continuous time flows such as but not limited to 2 originality the originality of the paper is not stellar but sufficient for acceptance significance the community can benefit from the experimental analysis of continuous time flows and the gpu implementation of the ode solver therefore i think this work is significant detailed questionscomments 1 in section 42 as an additional downside to mafddsf the authors argue that sampling cannot be performed analytically since ffjord needs to numerically propagate the ode i do not think ffjord can sample analytically either is this correct 2 the authors argue that they have no restriction on the architecture of the function f even if they have od estimation of the trace of the jacobian however they also say they make use of the bottleneck trick to reduce the variance that arises due to hutchinsons estimate of the trace this seems like a limitation on the architecture to me can the authors comment 3 in b1 in the appendix the street view house numbers dataset is mentioned but no results appear in the main text why not 4 in the results section it is not clear to me which numbers of the baselines for different datasets are taken from other papers and which numbers are obtained by the authors of this paper please clarify 5 in the conclusions when discussing future work the authors state that they are interested in reducing the number of function evaluations in the ode solvers in various disciplines many people have worked on this problem for a long time do the authors think major improvements are soon to be made 6 in section 52 the dependence of the number of function evaluations nfe on the data dimension d is discussed as a thought experiment they use the fact that going from an isotropic gaussian distribution in any d to an isotropic gaussian distribution has a corresponding differential equation of zero this should convince the reader that nfe is independent of d however this seems to me to be such a singular example that i gain no insight from it and it is not very convincing do the authors agree that this particular example does not add much if not please explain 1 chen et al neural ordinary differential equations nips 2018 2 chen et al continuoustime flows for deep generative models edit i have read the response of the authors and appreciate their clarifications and the additional information on the runtimes see my response below for 
the concern that remains about the absence of the estimate of the log likelihood for the vae experiments besides this issue the other commentsanswers were satisfactory and i think this paper is of interest to the research community so i will stick with my score docsepthis paper further explores the work of chen et al 2018 applied to reversible generative modelling while section 1 and 2 focuses on framing the context of this work the ode solver architecture for continuous normalizing flow learn a density mapping using an instantaneous change of variable formula the contribution of this work seems to be enabling the use of deeper neural network than in chen et al 2018 as part of the ode solver flow while the singlelayer architecture in chen et al 2018 enable efficient exact computation of the jacobian trace using a deeper architecture compromises that property as a result the authors propose to use the unbiased hutchinson trace estimator of the jacobian trace furthermore the authors observe that using a bottleneck architecture reduces the rank of the jacobian and can therefore help reducing the variance of the estimator the density estimation task in 2d is nice to see but lacks comparison with chen et al 2018 on which this paper improves moreover is the glow model used here only using additive coupling layers if so this might explain the difficulties of this glow model although the model presented in this paper doesnt obtain stateoftheart results on the larger problems the work presented in this paper demonstrates the ability of ode solvers as continuous normalizing flows to be competitive in the space of prescribed model concerning discussions and analysis given the lack of improvement using the bottleneck trick is there an actual improvement in variance using this trick or is this trick merely explaining why using a bottleneck architecture more suited for the hutchinson trace estimator in algorithm 1 is epsilon only one random vector that keeps being reused at every step of the solver algorithm i would be surprised that the use of a single random vector across different steps did not significantly increased the variance of the estimator ### Summary:
this paper proposes the use of recently proposed neural odes in a flowbased generative model as the paper shows a big advantage of a neural ode in a generative flow is that an unbiased estimator of the logdeterminant of the mapping is straightforward to construct another advantage compared to earlier published flows is that all variables can be updated in parallel as the method does not require chopping up the variables into blocks the paper shows significant improvements on several benchmarks and seems to be a promising avenue for further research a disadvantage of the method is that the authors were unable to show that the method could produce results that were similar to or better than the sota on the more challenging benchmark of cifar10 another downside is its computational cost since neural odes are relatively new however these problems might be resolved with further refinements to the method
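The O(d) likelihood estimation that this summary and the reviews above refer to rests on Hutchinson's trace estimator computed with reverse-mode vector-Jacobian products. The sketch below is a hedged illustration assuming a PyTorch-style autodiff API; the function name and arguments are placeholders, and it is not the FFJORD authors' implementation.

```python
import torch

def hutchinson_trace(f, z, n_samples=1):
    """Unbiased estimate of tr(df/dz) via E[eps^T (df/dz) eps], where eps
    has zero mean and identity covariance. Each sample needs a single
    vector-Jacobian product (one reverse-mode pass), roughly O(d) work,
    instead of the O(d^2) cost of the exact trace of the Jacobian."""
    z = z.detach().requires_grad_(True)
    fz = f(z)  # assumes f maps R^d to R^d, so fz has the same shape as z
    estimate = 0.0
    for _ in range(n_samples):
        eps = torch.randn_like(fz)
        # eps^T (df/dz), obtained as a vector-Jacobian product
        vjp = torch.autograd.grad(fz, z, grad_outputs=eps, retain_graph=True)[0]
        estimate = estimate + (vjp * eps).sum()
    return estimate / n_samples
```

If the estimate itself must be differentiated during training, create_graph=True would additionally be passed to torch.autograd.grad. How often eps is resampled (per solver step or per solve) is exactly the variance question raised in the last review above.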
8493, 253, 1180, 273, 1159, 27163, 275, 253, 258, 615, 1220, 735, 275, 2710, 32870, 1142, 952, 452, 4307, 327, 436, 1895, 323, 247, 1048, 673, 513, 253, 4477, 1158, 2201, 11701, 403, 3517, 281, 320, 1160, 721, 275, 2593, 8073, 253, 10096, 273, 253, 1180, 273, 1159, 27163, 295, 453, 327, 253, 941, 7877, 277, 310, 5469, 347, 247, 1869, 3368, 597, 897, 253, 958, 326, 1469, 432, 271, 29436, 305, 12064, 3268, 275, 667, 277, 281, 271, 29436, 305, 12064, 3268, 556, 247, 3969, 8967, 5150, 273, 5058, 436, 943, 18578, 253, 9414, 326, 295, 453, 310, 3907, 273, 277, 2299, 436, 3133, 281, 479, 281, 320, 824, 247, 11098, 1650, 326, 891, 6351, 642, 12288, 432, 352, 285, 352, 310, 417, 1077, 21414, 513, 253, 4477, 5194, 326, 436, 1798, 1650, 1057, 417, 823, 1199, 604, 417, 4496, 5513, 50275, 18, 260, 864, 1162, 355, 11454, 9826, 8967, 7424, 295, 2824, 4765, 374, 260, 864, 1162, 355, 44351, 26202, 553, 14221, 323, 3676, 1006, 800, 3210, 50275, 15576, 50275, 74, 452, 1239, 253, 2380, 273, 253, 4477, 285, 11435, 616, 8254, 6787, 285, 253, 3081, 1491, 327, 253, 1408, 3181, 923, 619, 2380, 2708, 323, 253, 4468, 326, 4558, 670, 253, 5928, 273, 253, 6642, 273, 253, 2412, 12177, 323, 253, 362, 3348, 4679, 16280, 436, 2523, 253, 643, 5701, 507, 49643, 497, 20297, 285, 891, 1158, 436, 2929, 310, 273, 1600, 281, 253, 2561, 3114, 594, 891, 588, 7356, 342, 619, 4868, 50276, 7152, 33032, 2520, 2929, 2007, 33826, 253, 789, 273, 260, 864, 1162, 355, 4765, 3732, 281, 24048, 1006, 800, 26278, 1223, 2593, 337, 285, 374, 16633, 327, 39926, 253, 3634, 273, 436, 789, 253, 258, 615, 47037, 10336, 323, 5415, 2622, 3006, 2685, 3037, 247, 4038, 10603, 970, 271, 35774, 1818, 273, 4778, 7212, 253, 7680, 273, 436, 789, 3133, 281, 320, 17690, 253, 897, 273, 12861, 11454, 2990, 685, 275, 260, 864, 1162, 355, 4765, 50276, 284, 629, 273, 253, 258, 615, 47037, 2685, 1223, 253, 2014, 12026, 10336, 275, 260, 864, 1162, 355, 4765, 8046, 5919, 3242, 13782, 273, 253, 480, 317, 706, 757, 10711, 970, 247, 12861, 10336, 10953, 3013, 326, 2867, 347, 247, 906, 253, 4477, 12661, 281, 897, 253, 38663, 288, 9248, 9258, 10711, 29107, 273, 253, 480, 317, 706, 757, 10711, 33810, 253, 4477, 10018, 326, 970, 247, 3673, 44856, 10336, 11355, 253, 5958, 273, 253, 480, 317, 706, 757, 285, 476, 3103, 1361, 8493, 253, 11041, 273, 253, 29107, 50276, 783, 4038, 13418, 4836, 275, 374, 69, 310, 5322, 281, 923, 533, 19756, 5301, 342, 260, 864, 1162, 355, 4765, 327, 534, 436, 2929, 19132, 25761, 310, 253, 15795, 1566, 908, 1060, 760, 970, 21842, 8789, 8090, 604, 594, 436, 1537, 5513, 253, 12748, 273, 436, 15795, 1566, 50276, 20261, 253, 1566, 3559, 275, 436, 2929, 36908, 4044, 1375, 23037, 14387, 1543, 327, 253, 4067, 3237, 253, 789, 3559, 275, 436, 2929, 14371, 253, 3745, 273, 258, 615, 1220, 735, 347, 5415, 2622, 3006, 14221, 281, 320, 12085, 275, 253, 2317, 273, 15588, 1566, 8664, 11985, 285, 1783, 50276, 28821, 253, 3480, 273, 7756, 970, 253, 3673, 44856, 10480, 310, 627, 271, 4588, 7756, 275, 11041, 970, 436, 10480, 390, 310, 436, 10480, 7960, 15571, 2139, 970, 247, 3673, 44856, 10336, 625, 18960, 323, 253, 288, 9248, 9258, 10711, 29107, 275, 5933, 337, 310, 299, 4277, 760, 581, 3632, 4972, 326, 11359, 1146, 294, 3197, 387, 1046, 3213, 273, 253, 47037, 5933, 891, 651, 320, 9861, 326, 253, 897, 273, 247, 2014, 3632, 4972, 2439, 1027, 5018, 858, 417, 3012, 2559, 253, 11041, 273, 253, 29107, 187, 187, 4118, 18435, 27, 2520, 2929, 29328, 253, 897, 273, 4102, 12661, 11454, 258, 3229, 275, 247, 2685, 3169, 1006, 800, 1566, 50275, 284, 253, 2929, 2722, 247, 1943, 5750, 
273, 247, 11454, 258, 615, 275, 247, 1006, 800, 2685, 310, 326, 271, 38663, 29107, 273, 253, 2412, 18916, 249, 386, 273, 253, 10603, 310, 15246, 281, 3989, 1529, 5750, 2429, 281, 4321, 3863, 14221, 310, 326, 512, 4903, 476, 320, 9300, 275, 7529, 347, 253, 1332, 1057, 417, 2430, 2093, 2784, 598, 253, 4903, 715, 8336, 50276, 783, 2929, 2722, 1534, 11701, 327, 2067, 49602, 285, 3133, 281, 320, 247, 12532, 18767, 323, 2007, 2561, 50276, 66, 18928, 273, 253, 1332, 310, 326, 253, 4477, 497, 7591, 281, 921, 326, 253, 1332, 812, 4711, 1543, 326, 497, 2074, 273, 1805, 685, 253, 256, 5503, 327, 253, 625, 11132, 22791, 273, 260, 338, 274, 740, 1529, 42719, 310, 697, 15180, 2105, 1580, 11454, 258, 3229, 403, 4942, 747, 2299, 841, 3237, 1537, 11512, 342, 2007, 46783, 3658, 281, 253, 1332, 209 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper tackles the problem of efficient offline rl in the multiagent setting marl that typically suffers from the curse of dimensionality as the number of agents grows they argue that transformers are ideally suited for estimating the rl components value functions and dynamics models and are able to implement efficient relational reasoning between agents they combine modelfree and modelbased rl algorithms with transformers as function approximators for the value function and dynamics model respectively and present theoretical results for the generalisation and suboptimality gaps for the resulting algorithms the papers main contribution is the theoretical analysis of marl algorithms when using transformers for function approximation showing that they can get significantly tighter bounds on the generalisation error eg for the modelfree algorithm the error bound becomes independent of the number of agents i am not familiar with details of prior work on offline marl and the presentation of the paper made it often difficult to judge the significance and the originality of some of the contributions eg we design offline modelfree and modelbased rl algorithms with the transformer approximators it seemed to me that the authors modified existing rl algorithms by simply replacing the function approximator used with transformers i would have appreciated more discussion on the assumptions required for the results to be able to identify which ones are the strongest instead of simply referring to other works that make similar assumptions eg the iid assumption instead of sequential on the offline data the paper is missing a discussion and/or experimental demonstration of how much of the favourable scaling properties would carry over to more realistic settings which would make the significance of the results a lot more clear as mentioned before the authors should elaborate on the assumptions necessary for the results and what we can expect in practice when they dont hold docsepthis paper presents theoretical analysis on the use of transformers for offline multiagent reinforcement learning the main contributions are i a proof that approximating the relational reasoning of set transformers using feedforward neural nets requires exponential width ii modelfree and modelbased algorithms for offline marl using transformers and pessimistic policies which minimize the effect of distribution shift and iii suboptimality gaps for the proposed algorithms showing that they scale well with the number of agents unfortunately i found this paper very hard to understand even after spending several hours on it and multiple reads not only is it theory intensive but it often introduces ideas and terminology too suddenly and with very sparse explanations since this is not my area of expertise i will opt for assuming that the math and derivations are correct in which case i think this paper is probably a good contribution to the conference the topic is clearly relevant and the use of transformers is gaining prominence in reinforcement learning so analyses such as the one presented in this paper are of great interest but again i must qualify this opinion with the caveat that im taking the results offered at face value i didnt see any discussion on limitations in this paper which in fact doesnt have a final discussion/conclusion section docsepthis paper concerns the theoretical understanding and relational reasoning of the permutation invariant agents framework in marl it proposes offline marl with the transformer and analyzes the error bound it utilizes the selfattention mechanism that is widely used in cv and nlp to model relational reasoning between agents it proposes both modelfree and modelbased offline marl it theoretically proves the gap does not scale with the number of agents and the proof is complete the environment is simple and cant empirically demonstrate the performance of the method the novelty is not enough the paper extends singleagent offline rl and utilizes set transformers as the neural network structure ### Summary:
the paper presents theoretical results justifying the use of transformers in cooperative multiagent rl the authors demonstrate that with this choice of architecture suboptimality gaps grow independently of the number of agents the theoretical contribution seems strong with a more limited experimental evaluation the paper is dense mathematically and was hard to assess the theorems were nevertheless deemed strong enough to justify acceptance however please do address comments from reviewer gwc2 in the final version
[ input_ids: token-id encoding of the review/summary example above — full array omitted for readability ]
[ attention_mask: all-1s token mask — full array omitted for readability ]
[ labels: token-id array identical to input_ids — full array omitted for readability ]
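For orientation, the omitted arrays above are simply the tokenized form of the Input/Output text of the same row: input_ids and the all-1s attention_mask come from tokenizing the full prompt, and labels mirrors input_ids as is typical for causal-LM finetuning data. The sketch below is a hypothetical reconstruction of how such a row could be assembled; the prompt template is copied from the visible rows, while the tokenizer checkpoint, max_length, and helper name are illustrative assumptions rather than details stated in this dump.

```python
# Hypothetical reconstruction (not taken from the dataset card) of how one row
# of this dump could be built. The prompt template mirrors the visible
# "### Review: ... ### Summary: ..." layout; the tokenizer checkpoint and
# max_length below are assumptions for illustration only.
from transformers import AutoTokenizer

PROMPT = (
    "Below is given a review of a research paper from a conference journal. "
    "Please write a summary of the review. ### Review: {review} ### Summary: {summary}"
)

def build_row(review: str, summary: str, tokenizer, max_length: int = 2048):
    """Return one dataset row with both the raw strings and the tokenized columns."""
    text = PROMPT.format(review=review, summary=summary)
    enc = tokenizer(text, truncation=True, max_length=max_length)
    return {
        "Input": review,                          # raw review text
        "Output": summary,                        # raw summary text
        "input_ids": enc["input_ids"],            # token ids of the full prompt
        "attention_mask": enc["attention_mask"],  # all 1s when nothing is padded
        "labels": list(enc["input_ids"]),         # causal-LM targets: copy of input_ids
    }

# Usage sketch (checkpoint name is a placeholder):
# tok = AutoTokenizer.from_pretrained("gpt2")
# row = build_row("the paper tackles the problem of ...", "the paper presents ...", tok)
```

Under these assumptions an unpadded example naturally yields an attention_mask of all 1s and labels that duplicate input_ids, which matches the arrays replaced by placeholders above.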
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper touches on the problem of evaluating the robustness of models to adversarial examples the proposed systematic evaluation protocol consists of 6 quantitative indicators of failure iofa and corresponding fixes to address the problems gradientbased attack failures eg shattered gradients are described first and then indicators eg unavailable gradients are introduced accordingly from this methods eg use bpda to improve the reliability of robustness evaluation are specified to mitigate loss landscape failures and attack optimization failures given positive experimental evidence the proposed pipeline is claimed to be effective by inspecting 7 previously published defenses related work limitations and potential future work are discussed code and data will be public strengths 1 the paper focuses on the valuable problem of evaluating and improving adversarial defense which is critical to the robustness of modern deep learning methods in practice 2 each indicator is wellmotivated by observed failures and is coupled with suggested fixes 3 the introduced pipeline is successfully examined to improve existing defense baselines by thoughtful experiments weaknesses 1 technically speaking the contribution of this work is incremental the proposed pipeline is not that impressive or novel rather it seems to be a pack of tricks to improve defense evaluation 2 although iofa is well supported by cited works and described failures its introduction lacks practical cases where figures 1 and 2 do not provide example failures and thus do not lead to a better understanding 3 the reported experimental results appear to evidence the proposed methods while a discussion regarding the case analysis and further studies is missing limitations are discussed in section 6 docsepthis paper proposes a systematic protocol for improving adversarial example evaluations against common failures the authors summarize six common failures and propose quantification metrics and mitigations for each failure experiments on 7 previously broken defenses show that the proposed protocol could have identified and improved their adaptive attacks originality strengths the systematic enforcement of failure checking and improving defense evaluation is novel the proposed protocol is a novel extension and identification of wellknown pitfalls in evaluating adversarial example defenses weaknesses most techniques are wellknown 1 30 while i understand and agree that connecting wellknown failures and fixes is valuable and proposing this systematic protocol is useful i am concerned that i could not find many new insights from the current discussion that is the current discussion is more like a rigorous summary in terms of a systematic protocol that is indeed novel of wellknown insights discussed in 1 30 part of my concern is that these failures indicators and fixes are not clearly cited aside from explicitly discussing that they have been previously explored failures 1-2 and their fixes are already sufficiently discussed by 1 30 their indicators are also discussed by 11 eg through blackbox attacks and explicitly checking randomness failures 3-6 and their fixes are already sufficiently discussed by 30 the evaluation is very similar to 1 30 the attacking procedures of many defenses are very similar to those provided by 1 30 except for the discussion of indicators for example kwta follows a similar logic as in 30 it and jpegc follow a similar logic
as in 1 quality strengths the evaluation is sound and shows that proposed indicators effectively identify weak attacks and that optimizing towards the proposed metrics can improve the robustness evaluation weaknesses unclear tightness of these indicators while i understand that passing these indicators does not mean a strong attack it is suggested to briefly discuss how tight these quantifiable indicators are that is how likely a weak attacker potentially outside the discussed defenses could inadvertently optimize for these indicators yet result in a weak attack not detected by these metrics i have these concerns because the hyperparameters and thresholds are only decided by empirical results of a few defenses some robustness is still higher than in previous attacks i am curious why some evaluations show much higher robustness than previously attacked by 1 30 for example the authors were not able to reduce the robustness of jpegc to 0 as has been done by 1 on the visualization of evaluations at l12 the authors mentioned that the proposed indicators could be used to visualize evaluations is this claim missing from the paper i am not sure if it refers to figure 1 if so this part is not clearly discussed clarity strengths the presentation and visualization are generally good weaknesses the evaluation part is dense and the readers might need to jump back and forth to look for references for six indicators and fixes significance strengths the idea of synthesizing known failures as indicators and providing quantifiable metrics can be useful although most techniques are wellknown synthesizing them in a systematic protocol seems to be a better way of filtering out weak evaluations weaknesses unclear significance in addition to known techniques i could not find many new insights from this paper which is more like an execution manuscript for wellknown adaptive attack techniques in a wellknown way that has been previously discussed in 1 30 given that most failures and fixes have already been sufficiently discussed in the literature the only remaining significance seems to be proposing the six metrics however i am not sure if these metrics are significant enough not to mention that some of them have also been discussed before limitations have been adequately discussed in the paper docsepthis paper proposes a set of indicators of failure for gradientbased adversarial ml attacks as well as a set of corresponding potential fixes with the aim of improving current adversarial robustness evaluations through increasing automation and systematization of the process the argument is that current proposed checklists can give a false sense of security as they do not ensure a reliable evaluation the paper brings together some reasonable indicators of different failure cases and some reasonable mitigations in each of those cases my main issues are the list of issues is almost certainly incomplete the indicators for each issue considered are not proven to guarantee capturing the issue especially in cases where some case by case threshold is required the mitigations are not guaranteed to be the only reasonable workaround there are still hyperparameters to be set/finetuned empirically/ad hoc overall while bringing together these indicators and mitigations into a more systematic approach is admirable there are no particular indicators or mitigations that are novel all seem to have been considered before in isolation admittedly or are just common sense thus this paper seems a reasonable guide for practical evaluation
and best practice or even a taxonomy but lacks novelty and impact limitations addressed docsepthe paper provides a set of quantitative indicators that can be used to determine if a given adversarial attack has been correctly implemented by using such indicators developers of defenses can assess whether the attack used to evaluate the defense is flawed or correct the paper also experimentally demonstrates the application of its main proposal showing that 7 existing defenses are potentially flawed due to the incorrect implementation of the corresponding attacks personal comment the paper sets the bar very high by mentioning many times and starting from the abstract the false sense of security provided by existing defenses against adversarial examples although i liked the initial tone which intrigued me and induced me to read the rest of the paper with high expectations i was disappointed by the overall takeaways provided by the paper indeed my impression is that the actual contribution of the paper is just a piece of software that can be useful to debug evaluations of adversarial attacks but in the single context of computer vision the paper is full of descriptions and observations but some of them are either obvious or apparently taken from prior work preventing one from determining the true contribution to the state of the art provided by the paper furthermore some of the considered defenses have already been found to be flawed so i am skeptical about their corresponding false sense of security which is still an overexaggeration discussed below put simply i believe that the biggest problem of the paper is that it is trying too much my general recommendation to the authors is to tone down the paper and focus on the specific contribution of their work which i think is there but still hidden to the stateoftheart high level originality poor the only true contribution in my opinion is the indicators which are limited to evaluation on a single application domain computer vision investigating failures in ml research is not new eg e and many works also do this in the adversarial ml domain eg 30 clarity good in general the paper reads well the motivations are properly explained and figures/tables are appropriate quality average the notation is faulty and the organization can be improved to better reflect the true contribution of the paper significance poor most of the content is well known and the limited applicability and limited evaluation hinder its overall significance to the wide spectrum of potential ml applications lowlevel strengths some findings have practical utility for developers the implementation can be useful for future work weaknesses incorrect notation only computer vision exaggerated claims unsurprising results where is the framework unclear contribution wrt sota confusing organization remark i thank the authors for their paper despite my criticism i endorse their line of research and i would have little objections against this paper if it were submitted to a less prestigious venue an intrinsic and not reported limitation is that the work mostly focuses on attacks/defenses in the computer vision domain neglecting the plethora of other domains in which adversarial examples can be conceived another limitation which is reported but still significant and should be a subject of discussion here is that the proposed set of indicators is only applicable today if new failures arise which are likely considering the fast advances in this research field then the current proposal will not be able to detect them
hence potentially inducing the same sense of false security which the authors aim to remove the issue here is that it is a devils proof the contribution shows some indicators but we cannot know if such indicators can cover all possible causes of failures that may affect current adversarial ml evaluations perhaps the authors could attempt to remedy the abovementioned issue by promising to maintain the repository for several years so that if new failures are found affecting either current or future research then they will be integrated into the toolkit and usable by future research my concern is this the moment a new paper finds a failure that is not included in those reported in this paper then the currently small contribution of this paper to the state of the art will be insignificant update after initial rebuttal the experiments on malware made me increase my score from 4 to 5 due to practically proving that the proposed methodology can be adapted to also cover domains other than cv update after authors discussion i am increasing the score to a 6 i would rate it a 6.4 ### Summary:
first thank you to the reviewers and authors for an in-depth discussion on the contributions and framing of this paper theres no doubt this paper was improved over the course of the rebuttal period and thank you again to the reviewers for having participated in a discussion to clear up the final points of this paper this paper essentially presents a checklist of best practices for adversarial evaluation such a contribution would hopefully induce more rigorous evaluation standards around adversarial attack research which is currently often stuck in this feedback loop of empirical attacks and defences new attacks are easily stopped by slight modifications to defences but those defences in turn fail under slightly modified attacks the authors propose to categorise various attack failures that plague prior work and propose a method to identify these failures ahead of time they also suggest mitigations to fix these possible sources of failure in essence this paper is a combination of survey reproduction and opinion paper all in one strength the main strength of this paper is that the framework developed by the authors is practical researchers will be able to use this framework to check the rigor of their empirical investigations hopefully leading to an increase in standards fieldwide however they also provide strong empirical results across three domains to validate their method of identifying failures weakness there is no novel technical contribution ie no new attack or defence the paper is more a rehashing of prior approaches i assume there will be plenty of papers of the first variety with impressive technical chops at the conference however and think this paper stands on its own as a different flavour whose content is worth acceptance
[ input_ids: token-id encoding of the review/summary example above — full array omitted (truncated in the source) ]
[ attention_mask: all-1s token mask — full array omitted ]
[ labels: token-id array identical to input_ids — full array omitted (truncated in the source) ]
326, 556, 644, 3786, 5469, 275, 337, 1884, 1677, 326, 954, 20101, 285, 26019, 452, 2168, 644, 10481, 5469, 275, 253, 6239, 253, 760, 5780, 8453, 3133, 281, 320, 36636, 253, 2800, 17082, 2299, 891, 717, 417, 2119, 604, 841, 17082, 403, 1534, 2217, 417, 281, 3748, 326, 690, 273, 731, 452, 671, 644, 5469, 1078, 7364, 452, 644, 18212, 5469, 275, 253, 2929, 5474, 33032, 2520, 2929, 29328, 247, 873, 273, 18172, 273, 4433, 323, 11786, 3169, 48960, 13361, 8104, 347, 973, 347, 247, 873, 273, 3969, 2442, 26019, 342, 253, 4388, 273, 11138, 1655, 48960, 31640, 27163, 949, 3629, 29885, 285, 985, 47159, 273, 253, 1232, 50276, 783, 4154, 310, 326, 1655, 4081, 2451, 28256, 476, 1918, 247, 3221, 3282, 273, 3988, 347, 597, 513, 417, 5416, 247, 9630, 7103, 50275, 783, 2929, 10316, 2366, 690, 5272, 18172, 273, 1027, 4433, 2219, 285, 690, 5272, 4784, 304, 569, 275, 1016, 273, 1110, 2219, 50276, 2577, 2022, 3374, 403, 209, 186, 783, 1618, 273, 3374, 310, 2761, 5604, 18464, 209, 186, 783, 18172, 323, 1016, 2523, 2783, 403, 417, 11464, 281, 12215, 281, 9232, 253, 2523, 3340, 275, 2219, 835, 690, 1083, 407, 1083, 7887, 310, 2424, 209, 186, 783, 4784, 304, 569, 403, 417, 16293, 281, 320, 253, 760, 5272, 42182, 209, 186, 9088, 403, 1335, 4373, 22041, 281, 320, 873, 71, 7795, 37437, 45190, 324, 26901, 28910, 4583, 1223, 9745, 2366, 841, 18172, 285, 4784, 304, 569, 715, 347, 625, 12082, 2746, 310, 50063, 627, 403, 642, 1798, 18172, 390, 4784, 304, 569, 326, 403, 4460, 512, 1646, 281, 452, 644, 2783, 1078, 275, 12940, 47421, 390, 403, 816, 1846, 3282, 3021, 436, 2929, 3133, 247, 5272, 7102, 323, 8542, 7103, 285, 1682, 2283, 885, 390, 1014, 247, 2891, 13646, 50276, 2858, 19756, 38135, 285, 3486, 50276, 17465, 569, 9713, 5474, 339, 431, 248, 2929, 3400, 247, 873, 273, 11745, 18172, 326, 476, 320, 908, 281, 3653, 604, 247, 1677, 48960, 2983, 556, 644, 9113, 9009, 407, 970, 824, 18172, 12259, 273, 25774, 476, 2939, 1880, 253, 2983, 908, 281, 7472, 253, 5684, 310, 33657, 390, 3451, 253, 2929, 671, 21657, 14371, 253, 2898, 273, 697, 2022, 10419, 4645, 326, 818, 5368, 25774, 403, 7826, 33657, 1955, 281, 253, 13583, 7092, 273, 253, 3969, 8104, 50276, 21941, 4385, 50276, 783, 2929, 5239, 253, 2534, 1077, 1029, 407, 29570, 20415, 2069, 285, 4983, 432, 253, 12002, 783, 3221, 3282, 273, 3988, 2530, 407, 5368, 25774, 1411, 48960, 6667, 3738, 891, 10490, 253, 3302, 10541, 534, 48515, 479, 285, 5802, 479, 281, 1239, 253, 1551, 273, 253, 2929, 342, 1029, 12656, 891, 369, 19271, 407, 253, 4583, 1379, 42287, 2530, 407, 253, 2929, 6296, 619, 13214, 310, 326, 253, 4588, 7680, 273, 253, 2929, 310, 816, 247, 5313, 273, 3694, 326, 476, 320, 4217, 281, 13844, 27163, 273, 48960, 8104, 2858, 275, 253, 2014, 3634, 273, 4382, 8113, 253, 2929, 310, 2120, 273, 20121, 285, 7313, 533, 690, 273, 731, 403, 2057, 4755, 390, 8505, 2668, 407, 2720, 789, 33898, 272, 281, 3653, 253, 2032, 7680, 281, 253, 1375, 273, 253, 1445, 2530, 407, 253, 2929, 33810, 690, 273, 253, 2783, 25774, 452, 2168, 644, 1119, 281, 320, 33657, 594, 891, 717, 33872, 670, 616, 3969, 3221, 3282, 273, 3988, 534, 310, 1335, 271, 258, 3764, 5260, 356, 3328, 35844, 264, 2708, 50276, 1065, 3365, 891, 2868, 326, 253, 5962, 1895, 273, 253, 2929, 310, 326, 352, 310, 2820, 1512, 1199, 619, 2087, 17401, 281, 253, 4477, 310, 281, 10541, 1066, 253, 2929, 285, 2770, 327, 253, 2173, 7680, 534, 891, 1158, 310, 627, 533, 1335, 8763, 273, 616, 789, 281, 253, 1375, 23037, 14387, 50275, 8656, 1268, 50276, 19164, 414, 4105, 253, 760, 2032, 7680, 275, 619, 4743, 403, 253, 18172, 4609, 403, 3710, 281, 
6760, 327, 247, 2014, 2898, 5028, 4382, 8113, 15686, 20101, 275, 13361, 2561, 310, 417, 747, 24088, 299, 285, 1142, 2987, 671, 513, 436, 275, 253, 48960, 13361, 5028, 24088, 1884, 50276, 498, 15752, 1175, 275, 2087, 253, 2929, 9563, 973, 253, 42852, 403, 6283, 5544, 285, 4677, 296, 2272, 403, 4569, 50276, 15177, 3388, 253, 14951, 310, 40249, 285, 253, 6003, 476, 320, 5520, 281, 1805, 4887, 253, 2032, 7680, 273, 253, 2929, 50276, 9188, 40348, 4105, 954, 273, 253, 2600, 310, 973, 1929, 285, 253, 3710, 30437, 3710, 7103, 17134, 398, 697, 4583, 8453, 281, 253, 259, 1487, 808, 4638, 273, 2442, 13361, 4893, 50274, 676, 5251, 50276, 296, 3755, 384, 84, 50276, 8826, 4342, 452, 8542, 11839, 323, 12259, 50276, 783, 7092, 476, 320, 4217, 323, 2852, 789, 50275, 20881, 1255, 265, 50276, 1763, 263, 6471, 14951, 50276, 7483, 4382, 8113, 50276, 911, 7215, 456, 3916, 50276, 4539, 321, 20733, 1543, 50276, 2811, 310, 253, 7792, 50276, 328, 8250, 7680, 8772, 256, 5503, 50276, 8259, 5302, 6003, 50276, 39808, 50276, 74, 5717, 253, 4477, 323, 616, 2929, 5747, 619, 14226, 891, 18883, 616, 1386, 273, 2561, 285, 891, 651, 452, 1652, 21915, 1411, 436, 2929, 604, 352, 497, 9262, 281, 247, 1679, 34544, 18767, 50276, 266, 15276, 285, 417, 2361, 12291, 310, 326, 253, 789, 6571, 16633, 327, 8104, 1545, 5060, 275, 253, 4382, 8113, 5028, 50276, 8265, 732, 272, 253, 48541, 273, 643, 10625, 275, 534, 48960, 6667, 476, 320, 20913, 50275, 23955, 12291, 534, 310, 2361, 533, 1335, 1534, 285, 943, 320, 247, 2256, 273, 5955, 1060, 310, 326, 253, 4081, 873, 273, 18172, 403, 760, 7763, 3063, 604, 747, 20101, 12893, 534, 403, 2779, 2783, 253, 3809, 16424, 275, 436, 2561, 1673, 840, 253, 1655, 10419, 588, 417, 320, 2104, 281, 2736, 731, 50276, 48521, 7826, 24635, 253, 1072, 3282, 273, 3221, 3988, 534, 253, 4477, 4388, 281, 5386, 253, 2523, 1060, 310, 326, 352, 310, 247, 1474, 3683, 4737, 253, 7680, 2722, 690, 18172, 2858, 359, 2550, 1024, 604, 824, 18172, 476, 3835, 512, 1896, 5997, 273, 20101, 326, 778, 2818, 1655, 48960, 13361, 27163, 50276, 30875, 253, 4477, 812, 3177, 281, 16748, 281, 253, 1840, 420, 900, 423, 2523, 407, 12532, 281, 6558, 253, 18491, 323, 2067, 1107, 594, 326, 604, 747, 20101, 403, 1119, 13567, 2057, 1655, 390, 2852, 29905, 2706, 840, 597, 588, 320, 8527, 275, 253, 4968, 11554, 285, 31998, 407, 2852, 29905, 2706, 619, 4468, 310, 436, 253, 2774, 247, 747, 2929, 9010, 247, 4433, 326, 310, 417, 2908, 275, 1110, 2361, 275, 436, 2929, 840, 253, 4390, 1355, 7680, 273, 436, 2929, 281, 253, 1375, 273, 253, 1445, 588, 320, 34584, 50274, 11183, 846, 3302, 30080, 22559, 253, 4679, 327, 36887, 1160, 479, 2572, 619, 4868, 432, 577, 281, 608, 1955, 281, 18236, 18597, 326, 253, 12661, 16182, 476, 320, 12956, 281, 3835, 671, 1027, 10625, 685, 30105, 50276, 11183, 846, 4477, 5955, 891, 717, 3629, 253, 4868, 281, 247, 721, 891, 651, 2281, 352, 247, 6705, 50276, 187, 187, 4118, 18435, 27, 7053, 5717, 368, 281, 253, 30628, 285, 4477, 323, 271, 801, 554, 394, 5955, 327, 253, 9021, 285, 39926, 273, 436, 2929, 253, 373, 642, 5545, 436, 2929, 369, 5520, 689, 253, 2282, 273, 253, 30080, 22559, 2180, 285, 5717, 368, 969, 281, 253, 30628, 323, 1907, 13640, 275, 247, 5955, 281, 2590, 598, 253, 2457, 2792, 273, 436, 2929, 50276, 2520, 2929, 9093, 10262, 247, 44282, 273, 1682, 8333, 323, 48960, 7103, 824, 247, 7680, 651, 18670, 10808, 625, 26565, 7103, 7465, 1475, 48960, 2983, 2561, 534, 310, 4390, 2223, 10960, 275, 436, 8680, 6287, 273, 16774, 2983, 285, 809, 2979, 747, 8104, 403, 4354, 6331, 407, 4512, 14586, 281, 809, 2979, 533, 597, 1891, 
762, 5777, 7321, 8104, 50276, 783, 4477, 12661, 281, 13213, 885, 2710, 2983, 20101, 326, 31781, 2720, 789, 285, 12661, 247, 1332, 281, 4271, 841, 20101, 6386, 273, 673, 597, 671, 1804, 4784, 304, 569, 281, 4993, 841, 1896, 4973, 273, 4433, 275, 17718, 436, 2929, 310, 247, 5019, 273, 6630, 21068, 285, 4743, 2929, 512, 275, 581, 50276, 45563, 253, 2022, 4757, 273, 436, 2929, 310, 326, 253, 7792, 3715, 407, 253, 4477, 310, 8542, 8607, 588, 320, 2104, 281, 897, 436, 7792, 281, 2451, 253, 8132, 263, 273, 616, 16774, 14006, 18670, 4283, 281, 271, 2572, 275, 7465, 1673, 4363, 2299, 597, 671, 2085, 2266, 16774, 1543, 2439, 1264, 10625, 281, 17813, 616, 1332, 273, 12488, 20101, 50276, 20881, 1255, 627, 310, 642, 4460, 7681, 7680, 26332, 642, 747, 2983, 50276, 1545, 566, 253, 2929, 310, 625, 247, 294, 73, 3834, 273, 2720, 7274, 891, 5467, 627, 588, 320, 9828, 9380, 273, 253, 806, 5235, 342, 13943, 7681, 448, 2695, 387, 253, 8059, 2299, 285, 1158, 436, 2929, 9572, 327, 697, 1211, 247, 1027, 34149, 3692, 2600, 310, 4409, 14924 ]